977 results for "analytical approaches"
Abstract:
The brain is a complex neural network with a hierarchical organization, and mapping its elements and connections is an important step towards understanding its function. Recent developments in diffusion-weighted imaging have made it possible to reconstruct the whole-brain structural network in vivo at a large scale and to study the brain's structural substrate in a framework that is close to the current understanding of brain function. However, methods to construct the connectome are still under development and should be carefully evaluated. To this end, the first two studies included in my thesis aimed at improving the analytical tools specific to the methodology of brain structural networks. The first of these papers assessed the repeatability of the most common global and local network metrics used in the literature to characterize the connectome, while the second paper evaluated the validity of further metrics based on the concept of communicability. Communicability is a broader measure of connectivity that also accounts for parallel and indirect connections. These additional paths may be important for reorganizational mechanisms in the presence of lesions as well as for enhancing integration in the network. These studies showed good to excellent repeatability of global network metrics when the same methodological pipeline was applied, but more variability was detected for local network metrics or when different thresholding strategies were used. In addition, communicability metrics were found to add insight into the integration properties of the network by detecting subsets of nodes that were highly interconnected or vulnerable to lesions. The other two studies used methods based on diffusion-weighted imaging to investigate the relationship between functional and structural connectivity and the etiology of schizophrenia. The third study integrated functional oscillations measured using electroencephalography (EEG) and functional magnetic resonance imaging (fMRI) with diffusion-weighted imaging data. This multimodal approach revealed a positive relationship between individual fluctuations of the EEG alpha frequency and diffusion properties of specific connections of two resting-state networks. Finally, in the fourth study diffusion-weighted imaging was used to probe for a relationship between the underlying white matter tissue structure and season of birth in schizophrenia patients. The results are in line with the neurodevelopmental hypothesis of early pathological mechanisms as the origin of schizophrenia. The different analytical approaches selected in these studies also provide arguments for discussing the current limitations in the analysis of brain structural networks. To sum up, the first two studies presented in this thesis illustrated the potential of brain structural network analysis to provide useful information on features of brain functional segregation and integration using reliable network metrics. In the other two studies, alternative approaches were presented. The joint discussion of the four studies allowed us to highlight the benefits and possibilities of connectome analysis as well as some of its current limitations.
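To make the communicability concept referred to above concrete, the following is a minimal illustrative sketch (not code from the thesis) of how weighted network communicability is commonly computed from a structural connectivity matrix, using the matrix exponential of a strength-normalized adjacency matrix; the toy matrix and the normalization choice are assumptions for illustration.

```python
# Minimal sketch: network communicability from a structural connectivity matrix.
# Uses the common weighted formulation (strength-normalized adjacency matrix
# followed by a matrix exponential). The matrix below is a toy example, not
# data from the thesis.
import numpy as np
from scipy.linalg import expm

def communicability(W):
    """Return the weighted communicability matrix exp(D^-1/2 W D^-1/2)."""
    strength = W.sum(axis=1)
    d_inv_sqrt = np.diag(1.0 / np.sqrt(np.maximum(strength, 1e-12)))
    return expm(d_inv_sqrt @ W @ d_inv_sqrt)

# Toy symmetric connectivity matrix for four regions
W = np.array([[0, 2, 1, 0],
              [2, 0, 3, 1],
              [1, 3, 0, 2],
              [0, 1, 2, 0]], dtype=float)

G = communicability(W)
print(G)              # pairwise communicability (direct + indirect walks)
print(G.sum(axis=1))  # a simple node-level communicability score
```

Because the matrix exponential sums contributions from walks of all lengths, indirect and parallel paths raise the communicability between node pairs even when no strong direct connection exists, which is why such metrics can flag nodes that are highly interconnected or vulnerable to lesions.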
Abstract:
The momentary, global functional state of the brain is reflected by its electric field configuration. Cluster analytical approaches have consistently extracted four head-surface brain electric field configurations that optimally explain the variance of their changes across time in spontaneous EEG recordings. These four configurations are referred to as EEG microstate classes A, B, C, and D and have been associated with verbal/phonological, visual, attention reorientation, and subjective interoceptive-autonomic processing, respectively. The present study tested these associations via intra-individual and inter-individual analysis approaches. The intra-individual approach tested the effect of task-induced increases in modality-specific processing on EEG microstate parameters. The inter-individual approach tested the effect of personal modality-specific parameters on EEG microstate parameters. We obtained multichannel EEG from 61 healthy, right-handed, male students during four eyes-closed conditions: object-visualization, spatial-visualization, verbalization (6 runs each), and resting (7 runs). After each run, we assessed participants' degrees of object-visual, spatial-visual, and verbal thinking using subjective reports. Before and after the recording, we assessed modality-specific cognitive abilities and styles using nine cognitive tests and two questionnaires. The EEG of all participants, conditions, and runs was clustered into four classes of EEG microstates (A, B, C, and D). RMANOVAs, ANOVAs, and post-hoc paired t-tests compared microstate parameters between conditions. TANOVAs compared microstate class topographies between conditions. Differences were localized using eLORETA. Pearson correlations assessed interrelationships between personal modality-specific parameters and EEG microstate parameters during no-task resting. As hypothesized, verbal as opposed to visual conditions consistently affected the duration, occurrence, and coverage of microstate classes A and B. Contrary to associations suggested by previous reports, parameters were increased for class A during visualization and for class B during verbalization. In line with previous reports, microstate D parameters were increased during no-task resting compared to the three internal, goal-directed tasks. Topographic differences between conditions concerned particular sub-regions of components of the metabolic default mode network. Modality-specific personal parameters did not consistently correlate with microstate parameters, except for verbal cognitive style, which correlated negatively with microstate class A duration and positively with class C occurrence. This is the first study that aimed to induce EEG microstate class parameter changes based on their hypothesized functional significance. Beyond the associations of microstate classes A and B with visual and verbal processing, respectively, and of microstate class D with interoceptive-autonomic processing, our results suggest that a finely tuned interplay between all four EEG microstate classes is necessary for the continuous formation of visual and verbal thoughts, as well as interoceptive-autonomic processing. Our results point to the possibility that the EEG microstate classes may represent the head-surface-measured activity of intra-cortical sources primarily exhibiting inhibitory functions. However, additional studies are needed to verify and elaborate on this hypothesis.
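As an illustration of the microstate parameters analysed above (duration, occurrence, and coverage), the following minimal sketch computes them from a per-sample sequence of microstate labels; the label sequence and sampling rate are invented, and the clustering step that assigns labels is not shown.

```python
# Minimal sketch: duration, occurrence, and coverage of EEG microstate classes
# from a per-sample label sequence (one class label per EEG sample).
# The label sequence and sampling rate below are made up for illustration.
from itertools import groupby

def microstate_parameters(labels, sfreq, classes=("A", "B", "C", "D")):
    total_s = len(labels) / sfreq
    segments = [(k, sum(1 for _ in g)) for k, g in groupby(labels)]
    params = {}
    for c in classes:
        runs = [n for k, n in segments if k == c]
        samples = sum(runs)
        params[c] = {
            "duration_ms": 1000.0 * samples / (sfreq * len(runs)) if runs else 0.0,
            "occurrence_per_s": len(runs) / total_s,
            "coverage": samples / len(labels),
        }
    return params

labels = list("AAAABBBCCCCDDDAABBBBCCDD")  # toy label sequence
print(microstate_parameters(labels, sfreq=250.0))
```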
Abstract:
The paper provides a fairly comprehensive examination of recent empirical work on discrimination within economics. The three major analytical approaches considered are traditional regression analysis of outcomes, paired testing or audits, and analysis of performance, where higher group performance suggests that a group has been treated unfavorably. The review covers research in the labor, credit, and consumption markets, as well as recent studies of discrimination within the legal system. The review suggests that the validity of interpreting observed racial differences as discrimination depends heavily on whether the analysis is based on a sample that is representative of a population of individuals or households or on a sample of market transactions, as well as on the analyst's ability to control for heterogeneity within that sample. Heterogeneous firm behavior and differentiated products, such as those found in labor and housing markets, can also confound empirical analyses of discrimination by confusing the allocation of individuals across firms or products with disparate treatment or by ignoring disparate impacts that might arise based on that allocation.
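As a hedged illustration of the first analytical approach discussed in the review (traditional regression analysis of outcomes), the sketch below fits a logistic regression of a callback indicator on a group indicator plus a control variable, in the spirit of a paired-testing or audit design; the variable names and simulated data are hypothetical and not drawn from the paper.

```python
# Minimal sketch of a regression-based discrimination analysis: a logistic
# regression of callbacks on a group indicator plus a control variable,
# as in paired-testing / audit designs. All data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "minority": rng.integers(0, 2, n),     # group indicator (hypothetical)
    "experience": rng.normal(5, 2, n),     # control variable (hypothetical)
})
logit_p = -1.0 + 0.3 * df["experience"] - 0.5 * df["minority"]
p_call = 1 / (1 + np.exp(-logit_p))
df["callback"] = (rng.random(n) < p_call).astype(int)

model = smf.logit("callback ~ minority + experience", data=df).fit(disp=0)
print(model.params["minority"])  # estimated log-odds gap in callbacks by group
```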
Abstract:
Strategic control is defined as the use of qualitative and quantitative tools for the evaluation of strategic organizational performance. Most research in strategic planning has focused on strategy formulation and implementation, but little work has been done on strategic performance evaluation, particularly in the area of cancer research. The objective of this study was to identify strategic control approaches and financial performance metrics used by major cancer centers in the country as an initial step in expanding the theory and practice behind strategic organizational performance. Focusing on hospitals that share a similar mandate and resource constraints was expected to improve measurement precision. The results indicate that most cancer centers use a wide selection of evaluation tools, but sophisticated analytical approaches were less common. In addition, there was evidence that high-performing centers tend to invest more resources in strategic performance analysis than centers with weaker financial results. The conclusions point to the need to incorporate a higher degree of analytical power in order to improve the tracking of strategic performance. This study is one of the first to concentrate on the area of strategic control.
Abstract:
Birth defects are the leading cause of infant mortality in the United States and are a major cause of lifetime disability. However, efforts to understand their causes have been hampered by a lack of population-specific data. During 1990–2004, 22 state legislatures responded to this need by proposing birth defects surveillance legislation (BDSL). The contrast between these states and those that did not pass BDSL provides an opportunity to better understand the conditions associated with US public health policy diffusion. This study identifies key state-specific determinants that predict: (1) the introduction of BDSL onto states' formal legislative agendas, and (2) the successful adoption of these laws. Secondary aims were to interpret these findings in a theoretically sound framework and to incorporate evidence from three analytical approaches. The study begins with a comparative case study of Texas and Oregon (states with divergent BDSL outcomes), including a review of historical documentation and content analysis of key informant interviews. After selecting and operationalizing explanatory variables suggested by the case study, Qualitative Comparative Analysis (QCA) was applied to publicly available data to describe important patterns of variation among 37 states. Results from logistic regression were compared to determine whether the two methods produced consistent findings. Themes emerging from the comparative case study included differing budgetary conditions and the significance of relationships within policy issue networks. However, the QCA and statistical analysis pointed to the importance of political parties and contrasting societal contexts. Notably, state policies that allow greater access to citizen-driven ballot initiatives were consistently associated with a lower likelihood of introducing BDSL. Methodologically, these results indicate that a case study approach, while important for eliciting valuable context-specific detail, may fail to detect the influence of overarching, systemic variables, such as party competition. However, the QCA and statistical analyses were limited by a lack of existing data to operationalize policy issue networks, and thus may have downplayed the impact of personal interactions. This study contributes to the field of health policy studies in three ways. First, it emphasizes the importance of collegial and consistent relationships among policy issue network members. Second, it calls attention to political party systems in predicting policy outcomes. Finally, it demonstrates a novel approach (QCA) to interpreting state data in a theoretically meaningful manner.
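For readers unfamiliar with QCA, the sketch below shows the core crisp-set step: building a truth table that groups cases (states) by their configuration of binary conditions and summarizing outcome consistency per configuration. The condition names and data are invented placeholders, not the study's actual variables.

```python
# Minimal sketch of the crisp-set QCA step: build a truth table grouping cases
# (states) by their configuration of binary conditions and summarize the
# outcome per configuration. Condition names and data are invented.
import pandas as pd

cases = pd.DataFrame({
    "state": ["TX", "OR", "CA", "NY", "FL", "WA"],
    "ballot_initiative": [0, 1, 1, 0, 0, 1],   # citizen-driven ballot access
    "party_competition": [1, 1, 0, 1, 1, 0],
    "budget_surplus": [1, 0, 1, 1, 0, 0],
    "bdsl_adopted": [1, 0, 1, 1, 1, 0],        # outcome
})

conditions = ["ballot_initiative", "party_competition", "budget_surplus"]
truth_table = (cases.groupby(conditions)["bdsl_adopted"]
                    .agg(n_cases="size", consistency="mean")
                    .reset_index())
print(truth_table)  # one row per observed configuration of conditions
```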
Abstract:
We present and examine a multi-sensor global compilation of mid-Holocene (MH) sea surface temperatures (SST), based on Mg/Ca and alkenone palaeothermometry and on reconstructions obtained using planktonic foraminifera and organic-walled dinoflagellate cyst census counts. We assess the uncertainties originating from using different methodologies and evaluate the potential of MH SST reconstructions as a benchmark for climate-model simulations. The comparison between different analytical approaches (time frame, baseline climate) shows that the choice of time window for the MH has a negligible effect on the reconstructed SST pattern, whereas the choice of baseline climate affects both the magnitude and the spatial pattern of the reconstructed SSTs. Comparison of the SST reconstructions made using different sensors shows significant discrepancies at a regional scale, with uncertainties often exceeding the reconstructed SST anomaly. Apparent patterns in SST may largely be a reflection of the use of different sensors in different regions. Overall, the uncertainties associated with the SST reconstructions are generally larger than the MH anomalies. Thus, the SST data currently available cannot serve as a target for benchmarking model simulations.
Abstract:
Terrestrial organic matter (OM) in pelagic sediments is discussed with regard to depositional processes and land-sea interactions in the modern and past glacial/interglacial Equatorial Atlantic. Special emphasis is placed on a critical evaluation of different analytical approaches (C/N, Rock-Eval pyrolysis, stable carbon isotopes, palynology, organic petrology, and selected biomarkers) which are currently used for the qualitative and quantitative assessment of terrigenous organic carbon. If binary mixing equations are used to calculate terrestrial and marine proportions of organic carbon, we consider the definition of end-member values to be most critical, since these values may be biased by a great number of independent controls. A combination of geochemical methods including optical studies (organic petrology and palynology) is therefore suggested to evaluate each individual proxy. Organic geochemical analyses performed on sediments from the modern and Late Quaternary Equatorial Atlantic provide evidence of fluctuations in the eolian supply of terrigenous OM related to changes in the intensity of the trade winds. Quantification of this organic fraction leads to differing proportions depending on the approach applied, i.e. the organic carbon isotopic composition or maceral analyses. The modern distribution of terrigenous OM reveals a decrease in supply towards the basin, contributing less than a fifth of the total OM in pelagic areas. Organic geochemical data indicate that sedimentation in the modern northeastern Brasil Basin is affected by lateral advection of reworked OM, probably from southern source areas. Glacial/interglacial deposits from the pelagic Equatorial Atlantic (ODP Site 663), covering isotopic stages 12 and 11, reveal that deposition of terrigenous OM was higher under past glacial conditions, corresponding to generally enhanced dust fluxes. Proportions of terrigenous OM, however, never exceed 50% of the total OM according to maceral analyses. Other estimates, recently proposed by Verardo and Ruddiman (1996), are considered to be too high, probably for analytical reasons. Palynological records in the Equatorial Atlantic parallel dust records. Increased proportions of grass pollen suggest the admixture of C4-plant material under modern and past glacial conditions. It is therefore assumed, as one possible interpretation, that C4-plant debris has an effect on sedimentary δ13Corg and might explain differences between isotopic and microscopic quantitative estimates. Using the difference between these two records, we calculate that the maximum supply of C4 material remains below 20% of the total OM for the deep modern and past glacial/interglacial Equatorial Atlantic.
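The binary mixing calculation mentioned above can be made explicit with a short sketch; the δ13C end-member values used here are illustrative assumptions, and, as the abstract stresses, their choice (and any C4-plant admixture) strongly biases the estimated terrigenous fraction.

```python
# Minimal sketch of a binary delta13C mixing calculation for the fraction of
# terrigenous organic carbon. End-member values are illustrative only; their
# choice (and C4-plant admixture) dominates the result.
def terrigenous_fraction(d13c_sample, d13c_marine=-20.0, d13c_terrestrial=-27.0):
    """Two-end-member mixing: f_terr = (d13C_sample - d13C_marine) / (d13C_terr - d13C_marine)."""
    f = (d13c_sample - d13c_marine) / (d13c_terrestrial - d13c_marine)
    return min(max(f, 0.0), 1.0)   # clip to the physically meaningful range

print(terrigenous_fraction(-22.5))  # ~0.36 with these illustrative end members
```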
Abstract:
Sedimentary sequences in ancient or long-lived lakes can reach several thousands of meters in thickness and often provide an unrivalled perspective of the lake's regional climatic, environmental, and biological history. Over the last few years, deep-drilling projects in ancient lakes became increasingly multi- and interdisciplinary, as, among others, seismological, sedimentological, biogeochemical, climatic, environmental, paleontological, and evolutionary information can be obtained from sediment cores. However, these multi- and interdisciplinary projects pose several challenges. The scientists involved typically approach problems from different scientific perspectives and backgrounds, and setting up the program requires clear communication and the alignment of interests. One of the most challenging tasks, besides the actual drilling operation, is to link diverse datasets with varying resolution, data quality, and age uncertainties to answer interdisciplinary questions synthetically and coherently. These problems are especially relevant when secondary data, i.e., datasets obtained independently of the drilling operation, are incorporated in analyses. Nonetheless, the inclusion of secondary information, such as isotopic data from fossils found in outcrops or genetic data from extant species, may help to achieve synthetic answers. Recent technological and methodological advances in paleolimnology are likely to increase the possibilities of integrating secondary information. Some of the new approaches have started to revolutionize scientific drilling in ancient lakes, but at the same time, they also add a new layer of complexity to the generation and analysis of sediment-core data. The enhanced opportunities presented by new scientific approaches to study the paleolimnological history of these lakes, therefore, come at the expense of higher logistic, communication, and analytical efforts. Here we review types of data that can be obtained in ancient lake drilling projects and the analytical approaches that can be applied to empirically and statistically link diverse datasets to create an integrative perspective on geological and biological data. In doing so, we highlight strengths and potential weaknesses of new methods and analyses, and provide recommendations for future interdisciplinary deep-drilling projects.
Abstract:
This work proposes an automatic methodology for modeling complex systems. Our methodology is based on the combination of Grammatical Evolution and classical regression to obtain an optimal set of features that form part of a linear and convex model. This technique provides both Feature Engineering and Symbolic Regression in order to infer accurate models without requiring designer effort or expertise. As advanced Cloud services become mainstream, the contribution of data centers to the overall power consumption of modern cities is growing dramatically. These facilities consume 10 to 100 times more power per square foot than typical office buildings. Modeling the power consumption of these infrastructures is crucial to anticipate the effects of aggressive optimization policies, but accurate and fast power modeling is a complex challenge for high-end servers that is not yet satisfied by analytical approaches. For this case study, our methodology minimizes the error in power prediction. This work has been tested using real Cloud applications, resulting in an average error in power estimation of 3.98%. Our work broadens the possibilities of deriving energy-efficient policies in Cloud data centers and is applicable to other computing environments with similar characteristics.
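A strongly simplified sketch of the underlying idea (grammar-generated nonlinear features feeding an ordinary linear least-squares model) is given below; a random search over grammar-derived features stands in for the actual Grammatical Evolution loop, and the synthetic telemetry variables are assumptions, not the paper's dataset.

```python
# Strongly simplified sketch of the approach's idea: candidate features are
# generated from a small grammar of transformations and the best subset is kept
# inside an ordinary linear regression. A random search stands in for the
# Grammatical Evolution loop; data and variable names are synthetic.
import numpy as np
from itertools import combinations

rng = np.random.default_rng(1)

# Synthetic "server telemetry": e.g. CPU utilization, frequency, temperature
X = rng.uniform(0.1, 1.0, size=(200, 3))
power = 80 + 40 * X[:, 0] * X[:, 1] + 0.5 * X[:, 2] ** 2 + rng.normal(0, 1, 200)

# Grammar of candidate feature expressions (name, transformed column)
unary = [("id", lambda v: v), ("sq", np.square), ("log", np.log1p)]
features = [(f"{name}(x{i})", f(X[:, i]))
            for i in range(X.shape[1]) for name, f in unary]
features += [(f"x{i}*x{j}", X[:, i] * X[:, j])
             for i, j in combinations(range(X.shape[1]), 2)]

def fit_error(cols):
    """RMSE of a linear model built from the selected grammar-derived features."""
    A = np.column_stack([features[c][1] for c in cols] + [np.ones(len(X))])
    coef, *_ = np.linalg.lstsq(A, power, rcond=None)
    return np.sqrt(np.mean((A @ coef - power) ** 2))

best = min((rng.choice(len(features), size=3, replace=False) for _ in range(300)),
           key=fit_error)
print([features[c][0] for c in best], fit_error(best))
```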
Abstract:
As advanced Cloud services become mainstream, the contribution of data centers to the overall power consumption of modern cities is growing dramatically. The average consumption of a single data center is equivalent to the energy consumption of 25,000 households. Modeling the power consumption of these infrastructures is crucial to anticipate the effects of aggressive optimization policies, but accurate and fast power modeling is a complex challenge for high-end servers that is not yet satisfied by analytical approaches. This work proposes an automatic method, based on Multi-Objective Particle Swarm Optimization, for the identification of power models of enterprise servers in Cloud data centers. Our approach, as opposed to previous procedures, not only considers workload consolidation when deriving the power model, but also incorporates other non-traditional factors such as the static power consumption and its dependence on temperature. Our experimental results show that we obtain slightly better models than classical approaches while simultaneously simplifying the power model structure and thus the number of sensors needed, which is very promising for short-term energy prediction. This work, validated with real Cloud applications, broadens the possibilities of deriving efficient energy-saving techniques for Cloud facilities.
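The sketch below illustrates the kind of server power model described above, a static term with a temperature dependence plus a utilization-driven dynamic term; for simplicity it is fitted with scipy's curve_fit as a single-objective stand-in for the paper's Multi-Objective PSO, and the telemetry is synthetic.

```python
# Minimal sketch of a server power model of the kind described above: a static
# term with temperature dependence plus a utilization-driven dynamic term.
# curve_fit is a simplified, single-objective stand-in for the paper's MOPSO;
# the telemetry below is synthetic.
import numpy as np
from scipy.optimize import curve_fit

def power_model(X, p_static, k_temp, k_util):
    util, temp = X
    return p_static + k_temp * temp + k_util * util

rng = np.random.default_rng(2)
util = rng.uniform(0.0, 1.0, 500)        # CPU utilization
temp = rng.uniform(35.0, 75.0, 500)      # CPU/inlet temperature (deg C)
measured = 70 + 0.4 * temp + 120 * util + rng.normal(0, 2, 500)

params, _ = curve_fit(power_model, (util, temp), measured, p0=[50.0, 0.1, 100.0])
pred = power_model((util, temp), *params)
print("fitted [P_static, k_temp, k_util]:", np.round(params, 2))
print("mean relative error: %.2f%%" % (100 * np.mean(np.abs(pred - measured) / measured)))
```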
Abstract:
A simple analytical model for the train-induced flow and its effects on pedestrians is presented in this paper. The expressions developed for the induced air velocity and pressure on the pedestrian surface, as well as their dependence on time, are obtained from unsteady potential theory. The relevant parameters and their effects are analysed, in particular the sensitivity of the pressure coefficient and its rate of change to the train and pedestrian transverse size, the distance to the tracks, and the pressure measurement location on the pedestrian surface. In spite of the extreme simplicity of the model and the expressions obtained, good correlation is observed with previously published experiments. This work is intended to fill the absence of published studies on analytical approaches to the problem of vehicle-induced pressure on pedestrians, allowing for simplified testing procedures.
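The paper's specific closed-form expressions are not reproduced here; the block below only recalls the generic unsteady-Bernoulli relation of potential theory from which such estimates of the pressure coefficient on the pedestrian surface are typically derived (φ denotes the perturbation potential induced by the passing train and U the train speed).

```latex
% Generic starting point of such potential-flow analyses (not the paper's
% specific closed-form result): unsteady Bernoulli for the perturbation
% potential phi, non-dimensionalized with the train speed U.
\[
  p - p_\infty = -\rho \left( \frac{\partial \phi}{\partial t}
                 + \frac{1}{2} |\nabla \phi|^2 \right),
  \qquad
  C_p = \frac{p - p_\infty}{\frac{1}{2}\rho U^2}
      = -\frac{2}{U^2}\frac{\partial \phi}{\partial t}
        - \frac{|\nabla \phi|^2}{U^2}.
\]
```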
Abstract:
A recent criticism that the biological species concept (BSC) unduly neglects phylogeny is examined under a novel modification of coalescent theory that considers multiple, sex-defined genealogical pathways through sexual organismal pedigrees. A competing phylogenetic species concept (PSC) also is evaluated from this vantage. Two analytical approaches are employed to capture the composite phylogenetic information contained within the braided assemblages of hereditary pathways of a pedigree: (i) consensus phylogenetic trees across allelic transmission routes and (ii) composite phenograms from quantitative values of organismal coancestry. Outcomes from both approaches demonstrate that the supposed sharp distinction between biological and phylogenetic species concepts is illusory. Historical descent and reproductive ties are related aspects of phylogeny and jointly illuminate biotic discontinuity.
Abstract:
Context. The X-ray spectra observed in the persistent emission of magnetars are evidence for the existence of a magnetosphere. The high-energy part of the spectra is explained by resonant cyclotron upscattering of soft thermal photons in a twisted magnetosphere, which has motivated an increasing number of efforts to improve and generalize existing magnetosphere models. Aims. We want to build more general configurations of twisted, force-free magnetospheres as a first step to understanding the role played by the magnetic field geometry in the observed spectra. Methods. First, we reviewed and extended previous analytical works to assess the viability and limitations of semi-analytical approaches. Second, we built a numerical code able to relax an initial configuration of a nonrotating magnetosphere to a force-free geometry, given an arbitrary form of the magnetic field at the stellar surface. The numerical code is based on a finite-difference time-domain, divergence-free, and conservative scheme, built on the magneto-frictional method used in other scenarios. Results. We obtain new numerical configurations of twisted magnetospheres, with distributions of twist and currents that differ from previous analytical solutions. The range of global twist of the new family of solutions is similar to that of existing semi-analytical models (up to a few radians), but the resulting geometry may be quite different. Conclusions. The geometry of twisted, force-free magnetospheres shows a wider variety of possibilities than previously considered. This has implications for the observed spectra and opens the possibility of implementing alternative models in radiative transfer simulations aiming at providing spectra to be compared with observations.
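For context, the block below states the generic magneto-frictional relaxation equations on which such schemes are commonly based (not the specific discretization of the code described here): the field is advanced with a fictitious velocity proportional to the Lorentz force until the force-free condition is approached.

```latex
% Generic magneto-frictional relaxation (constants absorbed into the friction
% coefficient nu), not the paper's specific discretization: the field evolves
% until the Lorentz force vanishes, i.e. a force-free magnetosphere.
\[
  \mathbf{v} = \nu \, \frac{(\nabla \times \mathbf{B}) \times \mathbf{B}}{B^2},
  \qquad
  \frac{\partial \mathbf{B}}{\partial t} = \nabla \times (\mathbf{v} \times \mathbf{B}),
  \qquad
  \text{force-free limit: } (\nabla \times \mathbf{B}) \times \mathbf{B} = 0 .
\]
```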
Abstract:
Ecological niche models make it possible to study the effect of the environment on the distribution of species by relating distribution data to environmental information. The objective of the present study was to estimate the ecological niche and describe the variability in the spatial distribution of the anchoveta (Engraulis ringens) using statistical ecological niche models. Two analytical approaches were used: by stock (northern, central, and southern) in the Southeast Pacific (SEP), and by developmental stage (pre-recruits, recruits, and adults) along the Peruvian coast. The ecological niche model used generalized additive models, georeferenced records of anchoveta presence and absence, and information on four environmental variables (sea surface temperature, sea surface salinity, surface chlorophyll-a concentration, and oxycline depth) between 1985 and 2008. No differences were found among the ecological niches of the three anchoveta stocks, and the models that used anchoveta information from the entire SEP were the ones that modeled the niche correctly. Regarding the analysis by stage, each developmental stage showed different tolerances to the environmental variables considered in this work, with the niches of the less developed stages being contained within those of the more developed stages. Separate studies for each developmental stage are recommended, which would allow a better understanding of the ecological relationships found in the ecological niche results. It is also recommended to run simulations with niche models that include more environmental variables, which could improve the spatial distribution maps of the anchoveta for the two analytical approaches.
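As a hedged illustration of the niche-modeling setup described above (presence/absence related to four environmental variables through smooth functions), the sketch below uses a spline-plus-logistic-regression pipeline in scikit-learn as a stand-in for the study's generalized additive models; the data are synthetic placeholders.

```python
# Minimal sketch of a niche model of the kind described above: presence/absence
# modeled as smooth functions of four environmental variables. A spline +
# logistic-regression pipeline stands in for the study's generalized additive
# models; all data below are synthetic placeholders.
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import SplineTransformer
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(3)
n = 1000
env = np.column_stack([
    rng.uniform(14, 26, n),      # sea surface temperature (deg C)
    rng.uniform(34.5, 35.5, n),  # sea surface salinity
    rng.uniform(0.1, 10, n),     # surface chlorophyll-a (mg m^-3)
    rng.uniform(5, 80, n),       # oxycline depth (m)
])
# Synthetic presence probability peaking at intermediate SST and shallow oxycline
p = 1 / (1 + np.exp(0.5 * (env[:, 0] - 19) ** 2 / 4 + 0.03 * env[:, 3] - 2))
presence = (rng.random(n) < p).astype(int)

niche_model = make_pipeline(SplineTransformer(n_knots=5, degree=3),
                            LogisticRegression(max_iter=1000))
niche_model.fit(env, presence)
print(niche_model.predict_proba(env[:5])[:, 1])  # predicted habitat suitability
```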