959 results for Complexity analysis
Abstract:
Background: The inherent complexity of statistical methods and clinical phenomena compels researchers with diverse domains of expertise to work in interdisciplinary teams, where no member has complete knowledge of the others' fields. As a result, knowledge exchange may often be characterized by miscommunication leading to misinterpretation, ultimately resulting in errors in research and even clinical practice. Although communication has a central role in interdisciplinary collaboration, and although miscommunication can have a negative impact on research processes, to the best of our knowledge no study has yet explored how data analysis specialists and clinical researchers communicate over time. Methods/Principal Findings: We conducted qualitative analysis of encounters between clinical researchers and data analysis specialists (epidemiologist, clinical epidemiologist, and data mining specialist). These encounters were recorded and systematically analyzed using a grounded theory methodology for extraction of emerging themes, followed by data triangulation and analysis of negative cases for validation. A policy analysis was then performed using a system dynamics methodology, looking for potential interventions to improve this process. Four major emerging themes were found. Definitions using lay language were frequently employed as a way to bridge the language gap between the specialties. Thought experiments presented a series of "what if" situations that helped clarify how the method or information from the other field would behave if exposed to alternative situations, ultimately aiding in explaining their main objective. Metaphors and analogies were used to translate concepts across fields, from the unfamiliar to the familiar. Prolepsis was used to anticipate study outcomes, thus helping specialists understand the current context based on an understanding of their final goal. Conclusion/Significance: The communication between clinical researchers and data analysis specialists presents multiple challenges that can lead to errors.
Abstract:
During the early Holocene two main Paleoamerican cultures thrived in Brazil: the Tradição Nordeste in the semi-arid Sertão and the Tradição Itaparica in the high plains of the Planalto Central. Here we report on paleodietary signals of a Paleoamerican found in a third Brazilian ecological setting - a riverine shellmound, or sambaqui, located in the Atlantic forest. Most sambaquis are found along the coast, and the peoples associated with them subsisted on marine resources. We report a different situation from the oldest recorded riverine sambaqui, called Capelinha. Capelinha is a relatively small sambaqui established along a river 60 km from the Atlantic coast. It contained the well-preserved remains of a Paleoamerican known as Luzio, dated to 9,945 +/- 235 years ago; the oldest sambaqui dweller so far. Luzio's bones were remarkably well preserved and allowed for stable isotopic analysis of diet. Although artifacts found at this riverine site show connections with the Atlantic coast, we show that he represents a population that was dependent on inland resources as opposed to marine coastal resources. After comparing Luzio's paleodietary data with that of other extant and prehistoric groups, we discuss where his group could have come from, whether a terrestrial diet persisted in riverine sambaquis, and how Luzio fits within the discussion of the replacement of Paleoamerican by Amerindian morphology. This study adds to the evidence for greater complexity in the prehistory of the colonization of, and adaptation to, the New World.
Abstract:
Alternative splicing of gene transcripts greatly expands the functional capacity of the genome, and certain splice isoforms may indicate specific disease states such as cancer. Splice junction microarrays interrogate thousands of splice junctions, but data analysis is difficult and error prone because of the increased complexity compared to differential gene expression analysis. We present Rank Change Detection (RCD) as a method to identify differential splicing events based upon a straightforward probabilistic model comparing the over- or underrepresentation of two or more competing isoforms. RCD has advantages over commonly used methods because it is robust to false positive errors due to nonlinear trends in microarray measurements. Further, RCD does not depend on prior knowledge of splice isoforms, yet it takes advantage of the inherent structure of mutually exclusive junctions, and it is conceptually generalizable to other types of splicing arrays or to RNA-Seq. RCD specifically identifies the biologically important cases when a splice junction becomes more or less prevalent compared to other mutually exclusive junctions. The example data are from different cell lines of glioblastoma tumors assayed with Agilent microarrays.
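As a rough illustration of the rank-change idea (not the published probabilistic RCD model), the Python sketch below ranks the mutually exclusive junctions of a single splicing event in two conditions and flags junctions whose rank changes; all names and intensity values are hypothetical.

```python
# Minimal sketch of a rank-change check between mutually exclusive splice
# junctions (illustrative only; not the published RCD implementation).
import numpy as np

def rank_change_events(junctions_a, junctions_b):
    """Return indices of junctions whose rank among mutually exclusive
    alternatives differs between condition A and condition B.

    junctions_a, junctions_b: 1-D arrays of log-intensities for the same
    set of mutually exclusive junctions in the two conditions.
    """
    rank_a = np.argsort(np.argsort(junctions_a))  # 0 = least expressed
    rank_b = np.argsort(np.argsort(junctions_b))
    return np.flatnonzero(rank_a != rank_b)       # junctions with a rank change

# Toy example: three mutually exclusive junctions for one splicing event.
control = np.array([8.1, 5.2, 3.0])   # junction 0 dominates in control
tumor   = np.array([4.9, 8.3, 3.1])   # junction 1 dominates in tumor
print(rank_change_events(control, tumor))  # -> [0 1]
```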
Abstract:
Aging is known to have a degrading influence on many structures and functions of the human sensorimotor system. The present work assessed aging-related changes in postural sway using fractal and complexity measures of the center of pressure (COP) dynamics, with the hypothesis that complexity and fractality decrease in older individuals. Older subjects (68 +/- 4 years) and young adult subjects (28 +/- 7 years) performed a quiet stance task (60 s) and a prolonged standing task (30 min) in which subjects were allowed to move freely. Long-range correlations (fractality) of the data were estimated by detrended fluctuation analysis (DFA); changes in entropy were estimated by the multi-scale entropy (MSE) measure. The DFA results showed that the fractal dimension was lower for the older subjects in comparison to the young adults, but the fractal dimensions of both groups were not different from 1/f noise for time intervals between 10 and 600 s. The MSE analysis performed with the typically applied adjustment to the criterion distance showed a higher degree of complexity in the older subjects, which is inconsistent with the hypothesis that complexity in the human physiological system decreases with aging. The same MSE analysis performed without the adjustment showed no differences between the groups. Taking all results together, the decrease in total postural sway and in long-range correlations in older individuals are signs of an adaptation process reflecting the diminishing ability to generate adequate responses on a longer time scale.
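For readers unfamiliar with DFA, the following Python sketch shows the standard procedure in its simplest form (integrate the signal, detrend fixed-size windows with a linear fit, and fit the log-log slope of fluctuation against window size); it is a generic illustration with arbitrary window sizes, not the analysis code used in the study.

```python
# Minimal detrended fluctuation analysis (DFA) sketch for a 1-D signal.
# The scaling exponent alpha is ~0.5 for white noise and ~1.0 for 1/f noise.
import numpy as np

def dfa_alpha(x, window_sizes):
    y = np.cumsum(x - np.mean(x))                  # integrated (profile) series
    fluctuations = []
    for n in window_sizes:
        n_windows = len(y) // n
        f2 = []
        for k in range(n_windows):
            seg = y[k * n:(k + 1) * n]
            t = np.arange(n)
            trend = np.polyval(np.polyfit(t, seg, 1), t)   # local linear trend
            f2.append(np.mean((seg - trend) ** 2))
        fluctuations.append(np.sqrt(np.mean(f2)))
    # Slope of log F(n) vs log n is the scaling exponent alpha.
    return np.polyfit(np.log(window_sizes), np.log(fluctuations), 1)[0]

sway = np.random.randn(6000)                        # stand-in for a COP trace
print(dfa_alpha(sway, window_sizes=[16, 32, 64, 128, 256]))  # ~0.5 for white noise
```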
Abstract:
Since their discovery 150 years ago, Neanderthals have been considered incapable of behavioural change and innovation. Traditional synchronic approaches to the study of Neanderthal behaviour have perpetuated this view and shaped our understanding of their lifeways and eventual extinction. In this thesis I implement an innovative diachronic approach to the analysis of Neanderthal faunal extraction, technology and symbolic behaviour as contained in the archaeological record of the critical period between 80,000 and 30,000 years BP. The thesis demonstrates patterns of change in Neanderthal behaviour which are at odds with traditional perspectives and which are consistent with an interpretation of increasing behavioural complexity over time, an idea that has been suggested but never thoroughly explored in Neanderthal archaeology. Demonstrating an increase in behavioural complexity in Neanderthals provides much needed new data with which to fuel the debate over the behavioural capacities of Neanderthals and the first appearance of Modern Human Behaviour in Europe. It supports the notion that Neanderthal populations were active agents of behavioural innovation prior to the arrival of Anatomically Modern Humans in Europe and, ultimately, that they produced an early Upper Palaeolithic cultural assemblage (the Châtelperronian) independent of modern humans. Overall, this thesis provides an initial step towards the development of a quantitative approach to measuring behavioural complexity which provides fresh insights into the cognitive and behavioural capabilities of Neanderthals.
Abstract:
Diachronic approaches provide the potential for a more sophisticated framework within which to examine change in Neanderthal behavioural complexity using archaeological proxies such as symbolic artefacts, faunal assemblages and technology. Analysis of the temporal appearance and distribution of such artefacts and assemblages provides the basis for identifying changes in Neanderthal behavioural complexity in terms of symbolism, faunal extraction and technology respectively. Although changes in technology and faunal extraction were examined in the wider study, only the results of the symbolic study are presented below to illustrate the potential of the approach.
Abstract:
This paper presents an analysis of dysfluencies in two oral tellings of a familiar children's story by a young boy with autism. Thurber & Tager-Flusberg (1993) postulate a lower degree of cognitive and communicative investment to explain the lower frequency of non-grammatical pauses observed in elicited narratives of children with autism in comparison to typically developing and intellectually disabled controls. We also found a very low frequency of non-grammatical pauses in our data, but indications of high engagement and cognitive and communicative investment. We point to a wider range of dysfluencies as indicators of cognitive load, and show that the kind and location of the dysfluencies produced may reveal which aspects of the narrative task create the greatest cognitive demand: here, mental state ascription, perspectivization, and adherence to story schema. This paper thus generates analytical options and hypotheses that can be explored further in a larger population of children with autism and typically developing controls.
Abstract:
The common approach of bioelectrical impedance analysis to estimating body water uses a wrist-to-ankle methodology which, although not indicated by theory, has the advantage of ease of application, particularly for clinical studies involving patients with debilitating diseases. A number of authors have suggested the use of a segmental protocol in which the impedances of the trunk and limbs are measured separately, to provide a methodology more in keeping with basic theory. The segmental protocol has not, however, been generally adopted, partly because of the increased complexity involved in its application, and partly because studies comparing the two methodologies have not clearly demonstrated a significant improvement from the segmental methodology. We have conducted a small pilot study involving ten subjects to investigate the efficacy of the two methodologies in a group of normal subjects. The study did not require an independent measure of body water, by, for example, isotope dilution, as the subjects were maintained in a state of constant hydration with only the distribution between limbs and trunk changing as a result of change in posture. The results demonstrate a significant difference between the two methodologies in predicting the expected constancy of body water in this study: the segmental methodology indicated a mean percentage change in extracellular water of -2.2%, which was not significantly different from the expected null result, whereas the wrist-to-ankle methodology indicated a mean percentage change in extracellular water of -6.6%, which is significantly different from both the null result and the value obtained from the segmental methodology (p = 0.006). Similar results were obtained using estimates of total body water from the two methodologies. (C) 1998 Elsevier Science Ltd. All rights reserved.
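To make the contrast between the two protocols concrete, the sketch below scores a hypothetical posture change with the usual impedance index (length squared over resistance, proportional to water volume up to a calibration coefficient), once from a single wrist-to-ankle measurement and once from summed segmental measurements; all lengths and resistances are invented for illustration and are not data from the study.

```python
# Illustrative (made-up) numbers showing how each protocol turns resistance
# measurements into a water index and how a posture change would be scored
# as a percentage change in that index. Not data from the study.
def index(length_cm, resistance_ohm):
    return length_cm ** 2 / resistance_ohm

def percent_change(before, after):
    return 100.0 * (after - before) / before

# Wrist-to-ankle: one whole-body measurement, stature as the length term.
wa_supine, wa_standing = index(175, 480), index(175, 512)

# Segmental: arm, trunk and leg measured separately and summed.
supine   = {"arm": (70, 230), "trunk": (55, 35), "leg": (85, 220)}
standing = {"arm": (70, 232), "trunk": (55, 36), "leg": (85, 215)}
seg_supine   = sum(index(L, R) for L, R in supine.values())
seg_standing = sum(index(L, R) for L, R in standing.values())

print(f"wrist-to-ankle change: {percent_change(wa_supine, wa_standing):+.1f}%")
print(f"segmental change:      {percent_change(seg_supine, seg_standing):+.1f}%")
```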
Abstract:
Objective: The aim of this article is to propose an integrated framework for extracting and describing patterns of disorders from medical images using a combination of linear discriminant analysis and active contour models. Methods: A multivariate statistical methodology was first used to identify the most discriminating hyperplane separating two groups of images (from healthy controls and patients with schizophrenia) contained in the input data. After this, the present work makes the differences found by the multivariate statistical method explicit by subtracting the discriminant models of controls and patients, weighted by the pooled variance between the two groups. A variational level-set technique was used to segment clusters of these differences. A label for each anatomical change was obtained using the Talairach atlas. Results: In this work all the data were analysed simultaneously rather than assuming a priori regions of interest. As a consequence, by using active contour models we were able to obtain regions of interest that were emergent from the data. The results were evaluated using, as gold standard, well-known facts about the neuroanatomical changes related to schizophrenia. Most of the items in the gold standard were covered by our result set. Conclusions: We argue that such an investigation provides a suitable framework for characterising the high complexity of magnetic resonance images in schizophrenia, as the results obtained indicate a high sensitivity rate with respect to the gold standard. (C) 2010 Elsevier B.V. All rights reserved.
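A greatly simplified sketch of the "weighted difference map, then segment clusters" step is shown below: it computes a per-voxel group difference scaled by the pooled standard deviation and then labels supra-threshold connected components, with plain thresholding standing in for the variational level-set segmentation actually used in the paper; shapes, threshold and names are illustrative only.

```python
# Simplified stand-in for the difference-map-and-cluster idea: a per-voxel
# group difference weighted by the pooled standard deviation, followed by
# connected-component labelling instead of level-set segmentation.
import numpy as np
from scipy import ndimage

def effect_clusters(controls, patients, threshold=2.0):
    # controls, patients: arrays of shape (n_subjects, X, Y, Z)
    pooled_sd = np.sqrt((controls.var(axis=0, ddof=1) +
                         patients.var(axis=0, ddof=1)) / 2.0) + 1e-9
    effect = (patients.mean(axis=0) - controls.mean(axis=0)) / pooled_sd
    labels, n_clusters = ndimage.label(np.abs(effect) > threshold)
    return effect, labels, n_clusters

# Toy volumes in place of real MR images.
controls = np.random.randn(12, 16, 16, 16)
patients = np.random.randn(10, 16, 16, 16) + 0.1
_, labels, n = effect_clusters(controls, patients)
print(n, "candidate clusters")
```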
Abstract:
C. L. Isaac and A. R. Mayes (1999a, 1999b) compared forgetting rates in amnesic patients and normal participants across a range of memory tasks. Although the results are complex, many of them appear to be replicable and there are several commendable features to the design and analysis. Nevertheless, the authors largely ignored 2 relevant literatures: the traditional literature on proactive inhibition/interference and the formal analyses of the complexity of the bindings (associations) required for memory tasks. It is shown how the empirical results and conceptual analyses in these literatures are needed to guide the choice of task, the design of experiments, and the interpretation of results for amnesic patients and normal participants.
Abstract:
Fault detection and isolation (FDI) are important steps in the monitoring and supervision of industrial processes. Biological wastewater treatment (WWT) plants are difficult to model, and hence to monitor, because of the complexity of the biological reactions and because plant influent and disturbances are highly variable and/or unmeasured. Multivariate statistical models have been developed for a wide variety of situations over the past few decades, proving successful in many applications. In this paper we develop a new monitoring algorithm based on Principal Components Analysis (PCA). It can be seen equivalently as making Multiscale PCA (MSPCA) adaptive, or as a multiscale decomposition of adaptive PCA. Adaptive Multiscale PCA (AdMSPCA) exploits the changing multivariate relationships between variables at different time-scales. Adaptation of scale PCA models over time permits them to follow the evolution of the process, inputs or disturbances. Performance of AdMSPCA and adaptive PCA on a real WWT data set is compared and contrasted. The most significant difference observed was the ability of AdMSPCA to adapt to a much wider range of changes. This was mainly due to the flexibility afforded by allowing each scale model to adapt whenever it did not signal an abnormal event at that scale. Relative detection speeds were examined only summarily, but seemed to depend on the characteristics of the faults/disturbances. The results of the algorithms were similar for sudden changes, but AdMSPCA appeared more sensitive to slower changes.
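The sketch below illustrates only the adaptive-PCA building block of such a monitor: an exponentially weighted update of the mean and covariance, a PCA model refit from the updated covariance, and a squared prediction error (SPE/Q) statistic for flagging abnormal samples. The multiscale (wavelet) layer and the scale-wise adaptation logic of AdMSPCA are omitted, and the class name, forgetting factor and toy fault are assumptions made for illustration.

```python
# Minimal adaptive-PCA monitoring sketch (illustrative; not AdMSPCA itself).
import numpy as np

class AdaptivePCAMonitor:
    def __init__(self, n_components, forgetting=0.99):
        self.k = n_components
        self.lam = forgetting
        self.mean = None
        self.cov = None

    def update(self, x):
        x = np.asarray(x, dtype=float)
        if self.mean is None:                       # initialise on first sample
            self.mean = x.copy()
            self.cov = np.eye(len(x))
            return 0.0
        self.mean = self.lam * self.mean + (1 - self.lam) * x
        d = (x - self.mean)[:, None]
        self.cov = self.lam * self.cov + (1 - self.lam) * (d @ d.T)
        vals, vecs = np.linalg.eigh(self.cov)       # refit PCA from covariance
        P = vecs[:, np.argsort(vals)[::-1][:self.k]]  # leading loadings
        residual = d.ravel() - P @ (P.T @ d.ravel())
        return float(residual @ residual)           # SPE / Q statistic

monitor = AdaptivePCAMonitor(n_components=2)
for t in range(200):
    sample = np.random.randn(5)
    if t > 150:
        sample[0] += 4.0                            # injected sensor fault
    spe = monitor.update(sample)
print("final SPE:", round(spe, 2))
```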
Abstract:
Observational data collected in the Lake Tekapo hydro catchment of the Southern Alps in New Zealand are used to analyse the wind and temperature fields in the alpine lake basin during summertime fair-weather conditions. Measurements from surface stations, pilot balloon and tethersonde soundings, Doppler sodar and an instrumented light aircraft provide evidence of multi-scale interacting wind systems, ranging from microscale slope winds to mesoscale coast-to-basin flows. Thermal forcing of the winds occurred due to differential heating as a consequence of orography and heterogeneous surface features, which is quantified by heat budget and pressure field analysis. The daytime vertical temperature structure was characterised by distinct layering. Features of particular interest are the formation of thermal internal boundary layers due to the lake-land discontinuity and the development of elevated mixed layers. The latter were generated by advective heating from the basin and valley sidewalls by slope winds and by a superimposed valley wind blowing from the basin over Lake Tekapo and up the tributary Godley Valley. Daytime heating in the basin and its tributary valleys caused the development of a strong horizontal temperature gradient between the basin atmosphere and that over the surrounding landscape, and hence the development of a mesoscale heat low over the basin. After noon, air from outside the basin started flowing over mountain saddles into the basin, causing cooling in the lowest layers, whereas at ridge-top height the horizontal air temperature gradient between inside and outside the basin continued to increase. In the early evening, a more massive intrusion of cold air caused rapid cooling and a transition to a rather uniform, slightly stable stratification up to about 2000 m above ground level. The onset time of this rapid cooling varied by about 1-2 h between observation sites and was probably triggered by the decay of up-slope winds inside the basin, which had previously countered the intrusion of air over the surrounding ridges. The intrusion of air from outside the basin continued until about midnight, when a northerly mountain wind from the Godley Valley became dominant. The results illustrate the extreme complexity that can be caused by the operation of thermal forcing processes at a wide range of spatial scales.
Abstract:
We analyzed the mouse Representative Transcript and Protein Set for molecules involved in brain function. We found full-length cDNAs of many known brain genes and discovered new members of known brain gene families, including Family 3 G-protein coupled receptors, voltage-gated channels, and connexins. We also identified previously unknown candidates for secreted neuroactive molecules. The existence of a large number of unique brain ESTs suggests an additional molecular complexity that remains to be explored. A list of genes containing CAG stretches in the coding region represents a first step in the potential identification of candidates for hereditary neurological disorders.
Abstract:
The branching structure of neurones is thought to influence patterns of connectivity and how inputs are integrated within the arbor. Recent studies have revealed a remarkable degree of variation in the branching structure of pyramidal cells in the cerebral cortex of diurnal primates, suggesting regional specialization in neuronal function. Such specialization in pyramidal cell structure may be important for various aspects of visual function, such as object recognition and color processing. To better understand the functional role of regional variation in the pyramidal cell phenotype in visual processing, we determined the complexity of the dendritic branching pattern of pyramidal cells in the visual cortex of the nocturnal New World owl monkey. We used the fractal dilation method to quantify the branching structure of pyramidal cells in the primary visual area (V1), the second visual area (V2) and the caudal and rostral subdivisions of inferotemporal cortex (ITc and ITr, respectively), which are often associated with color processing. We found that, as in diurnal monkeys, there was a trend for increasing fractal dimension of cells with progression through these cortical areas. The increasing complexity paralleled a trend for increasing symmetry. That we found a similar trend in both diurnal and nocturnal monkeys suggests that it was a feature of a common anthropoid ancestor.
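As an illustration of a dilation-style fractal measure (a Minkowski-Bouligand dilation estimate, not necessarily the exact variant used in the study), the sketch below dilates a binarised arbor image with disks of growing radius and estimates the dimension from the log-log slope of dilated area versus radius; the toy image and radii are arbitrary.

```python
# Dilation-based fractal dimension sketch for a binarised tracing:
# dilate with disks of increasing radius, fit log(area) vs log(radius),
# and take D = 2 - slope. Illustrative only.
import numpy as np
from scipy import ndimage

def disk(radius):
    y, x = np.ogrid[-radius:radius + 1, -radius:radius + 1]
    return x * x + y * y <= radius * radius

def dilation_fractal_dimension(skeleton, radii=(1, 2, 4, 8, 16)):
    areas = [ndimage.binary_dilation(skeleton, structure=disk(r)).sum()
             for r in radii]
    slope = np.polyfit(np.log(radii), np.log(areas), 1)[0]
    return 2.0 - slope

# Toy "arbor": a cross of two line segments on a blank image.
img = np.zeros((256, 256), dtype=bool)
img[128, 32:224] = True
img[32:224, 128] = True
print(round(dilation_fractal_dimension(img), 2))   # close to 1 for line segments
```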
Abstract:
Within the development of motor vehicles, crash safety (e.g. occupant protection, pedestrian protection, low-speed damageability) is one of the most important attributes. In order to fulfil increasing requirements within the framework of shorter cycle times and rising pressure to reduce costs, car manufacturers keep intensifying the use of virtual development tools such as those in the domain of Computer Aided Engineering (CAE). For crash simulations, the explicit finite element method (FEM) is applied. The accuracy of the simulation process is highly dependent on the accuracy of the simulation model, including the midplane mesh. One of the roughest approximations typically made concerns the actual part thickness which, in reality, can vary locally; to limit complexity, however, a constant thickness value is almost always defined throughout the entire part. On the other hand, for precise fracture analysis within FEM, correct consideration of thickness is one key enabler. Thus, the availability of per-element thickness information, which does not exist explicitly in the FEM model, can contribute significantly to improved crash simulation quality, especially regarding fracture prediction. Even though the thickness is not explicitly available from the FEM model, it can be inferred from the original CAD geometric model through geometric calculations. This paper proposes and compares two thickness estimation algorithms based on ray tracing and nearest-neighbour 3D range searches. A systematic quantitative analysis of the accuracy of both algorithms is presented, as well as a thorough identification of particular geometric arrangements under which their accuracy can be compared. These results enable the identification of each technique's weaknesses and hint towards a new, integrated approach to the problem that linearly combines the estimates produced by each algorithm.
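A minimal sketch of the nearest-neighbour flavour of the problem is given below: for each midplane element centroid, query a k-d tree of sampled points on each outer CAD surface and sum the two closest distances to estimate the local thickness (the ray-tracing variant would instead cast a ray along the element normal). The synthetic plate, point counts and function names are assumptions for illustration.

```python
# Nearest-neighbour thickness estimation sketch (illustrative only).
import numpy as np
from scipy.spatial import cKDTree

def nn_thickness(centroids, upper_surface_pts, lower_surface_pts):
    d_up, _ = cKDTree(upper_surface_pts).query(centroids)
    d_low, _ = cKDTree(lower_surface_pts).query(centroids)
    return d_up + d_low                       # per-element thickness estimate

# Toy plate of nominal thickness 2.0 around the z = 0 midplane.
rng = np.random.default_rng(0)
xy = rng.uniform(0, 10, size=(500, 2))
upper = np.c_[xy, np.full(len(xy), 1.0)]
lower = np.c_[xy, np.full(len(xy), -1.0)]
centroids = np.c_[rng.uniform(1, 9, size=(50, 2)), np.zeros(50)]
# Mean is close to the nominal 2.0, slightly above due to sparse surface sampling.
print(nn_thickness(centroids, upper, lower).mean())
```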