49 results for Analysis Tools

in BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance:

100.00%

Publisher:

Abstract:

Primate multisensory object perception involves distributed brain regions. To investigate the network character of these regions of the human brain, we applied data-driven group spatial independent component analysis (ICA) to a functional magnetic resonance imaging (fMRI) data set acquired during a passive audio-visual (AV) experiment with common object stimuli. We labeled three group-level independent component (IC) maps as auditory (A), visual (V), and AV, based on their spatial layouts and activation time courses. The overlap between these IC maps served as the definition of a distributed network of multisensory candidate regions, including superior temporal, ventral occipito-temporal, posterior parietal and prefrontal regions. In an independent second fMRI experiment, we explicitly tested their involvement in AV integration. Activations in nine of these twelve regions met the max-criterion (A < AV > V) for multisensory integration. Comparison of this approach with a general linear model-based region-of-interest definition revealed its complementary value for multisensory neuroimaging. In conclusion, we estimated networks of uni- and multisensory functional connectivity from one dataset and validated their functional roles in an independent dataset. These findings demonstrate the particular value of ICA for multisensory neuroimaging research and of using independent datasets to test hypotheses generated from a data-driven analysis.
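
As an illustration of the two analysis steps described above, the following Python sketch runs a spatial ICA on a toy data matrix and applies the max-criterion to region-of-interest responses. The data, component count and beta values are invented placeholders, and sklearn's FastICA stands in for whatever group ICA implementation the study actually used.

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
n_timepoints, n_voxels = 200, 1000
X = rng.standard_normal((n_timepoints, n_voxels))  # stand-in for preprocessed BOLD data

# Spatial ICA: treat voxels as observations so the estimated sources are
# spatial maps; the mixing matrix columns are the activation time courses.
ica = FastICA(n_components=3, random_state=0)
spatial_maps = ica.fit_transform(X.T)   # (n_voxels, n_components) IC maps
time_courses = ica.mixing_              # (n_timepoints, n_components)

def meets_max_criterion(beta_a, beta_v, beta_av):
    """Max-criterion for multisensory integration: the AV response must
    exceed both unisensory responses (A < AV > V)."""
    return beta_av > beta_a and beta_av > beta_v

print(meets_max_criterion(beta_a=1.2, beta_v=0.9, beta_av=1.8))  # True
```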

Relevance:

60.00%

Publisher:

Abstract:

Moose is a powerful reverse engineering platform, but its facilities and means to analyze software are separated from the tools developers typically use to develop and maintain their software systems: development environments such as Eclipse, VisualWorks, or Squeak. In practice, this requires developers to work with two distinct environments, one to actually develop the software and another (e.g., Moose) to analyze it. We worked on several different techniques, using both dynamic and static analyses, to provide software analysis capabilities to developers directly in the IDE. The immediate availability of analysis tools in an IDE significantly increases the likelihood that developers integrate software analysis into their daily work, as we discovered by conducting user studies with developers. Finally, we identified several important aspects of integrating software analysis in IDEs that need to be addressed in the future to increase the adoption of these techniques by developers.
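
To make the idea concrete, here is a minimal sketch of the kind of static analysis that can run directly inside an IDE, written with Python's standard ast module. The rule (flagging functions with too many parameters) and the threshold are invented for illustration and are not one of the Moose analyses.

```python
import ast

SOURCE = """
def process(a, b, c, d, e, f):
    return a + b + c + d + e + f
"""

MAX_PARAMS = 5  # hypothetical threshold

for node in ast.walk(ast.parse(SOURCE)):
    if isinstance(node, ast.FunctionDef):
        n_params = len(node.args.args)
        if n_params > MAX_PARAMS:
            # In an IDE integration this would surface as an inline warning.
            print(f"line {node.lineno}: '{node.name}' has {n_params} "
                  f"parameters (max {MAX_PARAMS})")
```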

Relevance:

60.00%

Publisher:

Abstract:

Brain electric activity is viewed as sequences of momentary maps of potential distribution. Frequency-domain source modeling, estimation of the complexity of the trajectory of the mapped brain field distributions in state space, and microstate parsing were used as analysis tools. Input-presentation as well as task-free (spontaneous thought) data collection paradigms were employed. We found: Alpha EEG field strength is more affected by visualizing mentation than by abstract mentation, both input-driven as well as self-generated. There are different neuronal populations and brain locations of the electric generators for different temporal frequencies of the brain field. Different alpha frequencies execute different brain functions, as revealed by canonical correlations with mentation profiles. Different modes of mentation engage the same temporal frequencies at different brain locations. The basic structure of alpha electric fields implies inhomogeneity over time — alpha consists of concatenated global microstates in the sub-second range, characterized by quasi-stable field topographies and rapid transitions between the microstates. In general, brain activity is strongly discontinuous, indicating that parsing into field landscape-defined microstates is appropriate. Different modes of spontaneous and induced mentation are associated with different brain electric microstates; these are proposed as candidates for psychophysiological "atoms of thought".
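
The following sketch illustrates the core of microstate parsing as described above: field topographies are sampled at peaks of global field power (GFP) and clustered into a few quasi-stable map classes. Real microstate analysis uses a polarity-invariant ("modified") k-means; plain KMeans on synthetic data is used here purely for illustration.

```python
import numpy as np
from scipy.signal import find_peaks
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
n_channels, n_samples = 32, 5000
eeg = rng.standard_normal((n_channels, n_samples))  # stand-in for average-referenced EEG

gfp = eeg.std(axis=0)        # global field power at each time point
peaks, _ = find_peaks(gfp)   # moments of maximal field strength
maps = eeg[:, peaks].T       # one topography per GFP peak

km = KMeans(n_clusters=4, n_init=10, random_state=0).fit(maps)
labels = km.labels_          # microstate class of each GFP-peak map
print(np.bincount(labels))   # how often each microstate class occurs
```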

Relevance:

60.00%

Publisher:

Abstract:

This article looks at the negotiations between Switzerland and Germany on air traffic regulation with the help of negotiation analysis tools. A number of factors pre-eminent in the literature on negotiation processes and outcomes are presented and critically assessed. In particular, arguments of “power”, which are often insufficiently explored in analyses of interstate cooperation, are brought back into the picture. The article argues that structural power best explains the negotiation results, while domestic politics and information asymmetries both account for the non-ratification of the treaty. Institutionalist arguments on the constraining effects of international norms and institutions, as well as explanations focusing on negotiation skills, are of minor importance. Moreover, the nature of the Swiss intra-governmental setting at the federal level did not encourage the Swiss negotiators to exploit all means during the different stages of the bargaining process. The article concludes by highlighting a number of policy observations in the broader context of Swiss foreign relations and indicating avenues for further research.

Relevance:

60.00%

Publisher:

Abstract:

The paper showcases the field- and lab-documentation system developed for Kinneret Regional Project, an international archaeological expedition to the northwestern shore of the Sea of Galilee (Israel) under the auspices of the University of Bern, the University of Helsinki, Leiden University and Wofford College. The core of the data management system is a fully relational, server-based database framework, which also includes time-based and static GIS services, stratigraphic analysis tools and fully indexed document/digital image archives. Data collection in the field is based on mobile, hand-held devices equipped with a custom-tailored stand-alone application. Comprehensive three-dimensional documentation of all finds and findings is achieved by means of total stations and/or high-precision GPS devices. All archaeological information retrieved in the field – including tachymetric data – is synced with the core system on the fly and is thus immediately available for further processing in the field lab (within the local network) or for post-excavation analysis at remote institutions (via the WWW). Besides a short demonstration of the main functionalities, the paper also presents some of the key technologies used and illustrates usability aspects of the system’s individual components.
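
A minimal sketch of the core idea, not the Kinneret system itself: a relational store in which every find carries full three-dimensional coordinates from the total station or GPS, so spatial queries are possible immediately after synchronization. Table and column names are invented for illustration.

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("""
    CREATE TABLE finds (
        find_id   INTEGER PRIMARY KEY,
        locus     TEXT NOT NULL,   -- stratigraphic unit
        category  TEXT NOT NULL,   -- e.g. pottery, bone, coin
        x REAL, y REAL, z REAL     -- tachymetric coordinates
    )
""")
con.execute("INSERT INTO finds VALUES (1, 'L1023', 'pottery', 201.31, 745.82, -210.57)")
con.execute("INSERT INTO finds VALUES (2, 'L1023', 'coin',    201.88, 745.10, -210.60)")

# Example post-excavation query: all finds from one locus, ordered by depth.
for row in con.execute("SELECT find_id, category, z FROM finds "
                       "WHERE locus = 'L1023' ORDER BY z"):
    print(row)
```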

Relevance:

60.00%

Publisher:

Abstract:

Stray light contamination considerably reduces the precision of photometry of faint stars for low-altitude spaceborne observatories. When measuring faint objects, stray light contamination must be coped with in order to avoid systematic impacts on low signal-to-noise images. Stray light contamination can be represented by a flat offset in CCD data. Mitigation begins with a comprehensive study during the design phase, followed by the use of target-pointing optimisation and post-processing methods. We present a code that aims at simulating the stray light contamination in low-Earth orbit coming from reflection of solar light by the Earth. StrAy Light SimulAtor (SALSA) is a tool intended to be used at an early stage to evaluate the effectively visible region of the sky and, therefore, to optimise the observation sequence. SALSA can compute Earth stray light contamination for significant periods of time, allowing mission-wide parameters to be optimised (e.g. imposing constraints on the point source transmission function (PST) and/or on the altitude of the satellite). It can also be used to study the behaviour of the stray light at different seasons or latitudes. Given the position of the satellite with respect to the Earth and the Sun, SALSA computes the stray light at the entrance of the telescope following a geometrical technique. After characterising the illuminated region of the Earth, the portion of illuminated Earth that affects the satellite is calculated. Then the flux of reflected solar photons is evaluated at the entrance of the telescope. Using the PST of the instrument, the final stray light contamination at the detector is calculated. The analysis tools include time series analysis of the contamination, evaluation of the sky coverage and an object visibility predictor. Effects of the South Atlantic Anomaly and of any shutdown periods of the instrument can be added. Several designs or mission concepts can easily be tested and compared. The code is not intended as a stand-alone mission designer. Its mandatory inputs are a time series describing the trajectory of the satellite and the characteristics of the instrument. This software suite has been applied to the design and analysis of CHEOPS (CHaracterizing ExOPlanet Satellite). This mission requires very high precision photometry to detect very shallow transits of exoplanets. Different altitudes and characteristics of the detector have been studied in order to find the best parameters to reduce the effect of contamination. © (2014) COPYRIGHT Society of Photo-Optical Instrumentation Engineers (SPIE).
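
As a toy version of the geometrical technique described here, the sketch below estimates, for a given satellite and Sun geometry, the fraction of the Earth that is both sunlit and visible from the satellite, and scales the solar flux by a constant albedo. All constants and the flux model are simplified assumptions and do not reproduce SALSA's actual computation or its PST handling.

```python
import numpy as np

R_EARTH = 6371e3       # m
SOLAR_CONST = 1361.0   # W/m^2 at 1 AU
ALBEDO = 0.3           # mean Bond albedo, assumed constant

def visible_illuminated_fraction(sat_pos, sun_dir, n=100_000):
    """Monte Carlo estimate of the fraction of the Earth's surface that is
    both sunlit and above the local horizon as seen from the satellite."""
    rng = np.random.default_rng(0)
    pts = rng.standard_normal((n, 3))
    pts /= np.linalg.norm(pts, axis=1, keepdims=True)   # surface normals
    surf = pts * R_EARTH                                # surface points
    sunlit = pts @ sun_dir > 0                          # facing the Sun
    to_sat = sat_pos - surf
    visible = np.einsum("ij,ij->i", pts, to_sat) > 0    # satellite above horizon
    return np.mean(sunlit & visible)

sat = np.array([0.0, 0.0, R_EARTH + 700e3])   # hypothetical 700 km altitude
sun = np.array([1.0, 0.0, 0.0])               # unit vector towards the Sun
frac = visible_illuminated_fraction(sat, sun)
# Crude reflected-flux proxy at the telescope entrance (before any PST):
print(frac, ALBEDO * SOLAR_CONST * frac)
```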

Relevance:

60.00%

Publisher:

Abstract:

Architectural decisions can be interpreted as structural and behavioral constraints that must be enforced in order to guarantee overarching qualities in a system. Enforcing those constraints in a fully automated way is often challenging and not well supported by current tools. Current approaches for checking architecture conformance either lack usability or offer poor options for adaptation. To overcome this problem, we analyze the current state of practice and propose an approach based on an extensible, declarative and empirically grounded specification language. This solution aims at reducing the overall cost of setting up and maintaining an architectural conformance monitoring environment by decoupling the conceptual representation of a user-defined rule from its technical specification prescribed by the underlying analysis tools. By using a declarative language, we are able to write tool-agnostic rules that are simple enough to be understood by untrained stakeholders and, at the same time, can be automatically processed by a conformance checking validator. Besides addressing the issue of cost, we also investigate opportunities for increasing the value of conformance checking results by assisting the user towards the full alignment of the implementation with its architecture. In particular, we show the benefits of providing actionable results by introducing a technique which automatically selects the optimal repair solutions by means of simulation and profit-based quantification. We perform various case studies to show how our approach can be successfully adopted to support truly diverse industrial projects. We also investigate the dynamics involved in choosing and adopting a new automated conformance checking solution within an industrial context. Our approach reduces the cost of conformance checking by avoiding the need for explicit management of the involved validation tools. The user can define rules using a convenient high-level DSL which automatically adapts to emerging analysis requirements. Increased usability and modular customization ensure lower costs and a shorter feedback loop.
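
The sketch below illustrates the idea of tool-agnostic, declarative conformance rules: rules are plain data that a generic validator interprets against a dependency model of the system. The rule vocabulary and the example modules are invented for illustration and are not the DSL proposed in the paper.

```python
# Dependency model of a hypothetical system: (source module, target module).
dependencies = {
    ("ui", "domain"), ("domain", "persistence"), ("ui", "persistence"),
}

# Declarative, tool-agnostic rules expressed as plain data.
rules = [
    {"kind": "forbidden", "from": "ui", "to": "persistence",
     "rationale": "UI must reach storage only through the domain layer"},
]

def check(rules, dependencies):
    """Generic validator: interpret each rule against the dependency model."""
    for rule in rules:
        if rule["kind"] == "forbidden" and (rule["from"], rule["to"]) in dependencies:
            yield f"violation: {rule['from']} -> {rule['to']} ({rule['rationale']})"

for violation in check(rules, dependencies):
    print(violation)
```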

Relevance:

40.00%

Publisher:

Abstract:

Background: Falls of elderly people may cause permanent disability or death. Particularly susceptible are elderly patients in rehabilitation hospitals. We systematically reviewed the literature to identify falls prediction tools available for assessing elderly inpatients in rehabilitation hospitals. Methods and Findings: We searched six electronic databases using comprehensive search strategies developed for each database. Estimates of sensitivity and specificity were plotted in ROC space graphs and pooled across studies. Our search identified three studies which assessed the prediction properties of falls prediction tools in a total of 754 elderly inpatients in rehabilitation hospitals. Only the STRATIFY tool was assessed in all three studies; the other identified tools (PJC-FRAT and DOWNTON) were each assessed by a single study. For a STRATIFY cut-score of two, pooled sensitivity was 73% (95% CI 63 to 81%) and pooled specificity was 42% (95% CI 34 to 51%). An indirect comparison of the tools across studies indicated that the DOWNTON tool has the highest sensitivity (92%), while the PJC-FRAT offers the best balance between sensitivity and specificity (73% and 75%, respectively). All studies presented major methodological limitations. Conclusions: We did not identify any tool which had an optimal balance between sensitivity and specificity, or which was clearly better than a simple clinical judgment of the risk of falling. The limited number of identified studies, and their major methodological limitations, impairs sound conclusions on the usefulness of falls risk prediction tools in geriatric rehabilitation hospitals.
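
As a worked illustration of the pooling reported above, the sketch below pools sensitivity and specificity by summing 2x2 counts across studies. The counts are invented placeholders chosen so the result matches the reported 73%/42%; they are not the actual study data.

```python
def pooled_proportion(events, totals):
    """Pool a proportion across studies by summing numerators and denominators."""
    return sum(events) / sum(totals)

# (true positives, fallers) per hypothetical study:
tp, fallers = [40, 55, 33], [52, 78, 45]
# (true negatives, non-fallers) per hypothetical study:
tn, non_fallers = [90, 110, 60], [210, 260, 149]

sensitivity = pooled_proportion(tp, fallers)        # P(test+ | faller)
specificity = pooled_proportion(tn, non_fallers)    # P(test- | non-faller)
print(f"pooled sensitivity {sensitivity:.0%}, pooled specificity {specificity:.0%}")
```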

Relevance:

30.00%

Publisher:

Abstract:

During the last 10 years, several molecular markers have been established as useful tools in the armamentarium of the hematologist. As a consequence, the number of hematologic molecular analyses performed has increased immensely. Often, such tests replace or complement other laboratory methods. Molecular markers can be useful in many ways: they can serve for diagnostics, describe the prognostic profile, predict which types of drugs are indicated, and can be used for therapeutic monitoring of the patient to indicate an adequate response or predict resistance or relapse of the disease. Many markers fulfill more than one of these roles. Most important, however, is the right choice of analyses at the right time points.

Relevance:

30.00%

Publisher:

Abstract:

Advances in novel molecular biological diagnostic methods are changing the way metabolic disorders such as growth hormone deficiency are diagnosed and studied. Faster sequencing and genotyping methods require strong bioinformatics tools to make sense of the vast amount of data generated by modern laboratories. Advances in genome sequencing, and in the computational power to analyze whole genome sequences, will guide the diagnostics of the future. In this chapter, basic bioinformatics resources needed to study metabolic disorders are reviewed, and some examples of bioinformatics analysis of the human growth hormone gene, protein and structure are provided.
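
As a small example of the kind of basic bioinformatics step such a chapter covers, the sketch below translates a coding sequence and reports simple statistics using Biopython. The sequence is a made-up fragment, not the real GH1 gene.

```python
from Bio.Seq import Seq

cds = Seq("ATGGCTACAGGCTCCCGGACG")   # hypothetical coding-sequence fragment
protein = cds.translate()
print(protein)                       # MATGSRT
print(f"length: {len(protein)} aa, GC content of CDS: "
      f"{100 * (cds.count('G') + cds.count('C')) / len(cds):.1f}%")
```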

Relevance:

30.00%

Publisher:

Abstract:

In the context of drug hypersensitivity, our group has recently proposed a new model based on the structural features of drugs (pharmacological interaction with immune receptors; p-i concept) to explain their recognition by T cells. According to this concept, even chemically inert drugs can stimulate T cells because certain drugs interact in a direct way with T-cell receptors (TCR) and possibly major histocompatibility complex molecules without the need for metabolism and covalent binding to a carrier. In this study, we investigated whether mouse T-cell hybridomas transfected with drug-specific human TCR can be used as an alternative to drug-specific T-cell clones (TCC). Indeed, they behaved like TCC and, in accordance with the p-i concept, the TCR recognize their specific drugs in a direct, processing-independent, and dose-dependent way. The presence of antigen-presenting cells was a prerequisite for interleukin-2 production by the TCR-transfected cells. The analysis of cross-reactivity confirmed the fine specificity of the TCR and also showed that TCR transfectants might provide a tool to evaluate the potential of new drugs to cause hypersensitivity due to cross-reactivity. Recombining the alpha- and beta-chains of sulfanilamide- and quinolone-specific TCR abrogated drug reactivity, suggesting that both original alpha- and beta-chains were involved in drug binding. The TCR-transfected hybridoma system showed that the recognition of two important classes of drugs (sulfanilamides and quinolones) by TCR occurred according to the p-i concept and provides an interesting tool to study drug-TCR interactions and their biological consequences and to evaluate the cross-reactivity potential of new drugs of the same class.
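
One way the dose-dependent IL-2 readout from such TCR transfectants could be quantified is by fitting a four-parameter logistic (Hill) curve, sketched below with scipy. All measurements and fitted parameter values are invented placeholders, not data from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def four_pl(dose, bottom, top, ec50, hill):
    """Four-parameter logistic dose-response model."""
    return bottom + (top - bottom) / (1.0 + (ec50 / dose) ** hill)

dose = np.array([0.1, 0.3, 1.0, 3.0, 10.0, 30.0])        # drug conc., ug/ml (toy)
il2 = np.array([12.0, 20.0, 55.0, 140.0, 210.0, 235.0])  # IL-2, pg/ml (toy)

params, _ = curve_fit(four_pl, dose, il2, p0=[10, 240, 2, 1])
bottom, top, ec50, hill = params
print(f"EC50 ~ {ec50:.2f} ug/ml, Hill slope ~ {hill:.2f}")
```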

Relevance:

30.00%

Publisher:

Abstract:

Acute lung injury is associated with a variety of histopathological alterations, such as oedema formation, damage to the components of the blood–air barrier and impairment of the surfactant system. Stereological methods are indispensable tools with which to properly quantitate these structural alterations at the light and electron microscopic level. The stereological parameters that are relevant for the analysis of acute lung injury are reviewed in the present article.
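
As a worked example of the simplest stereological estimator relevant here, the sketch below applies the point-counting principle V_V = P_P: the volume fraction of a compartment equals the expected fraction of test points hitting it. The counts are invented for illustration.

```python
# Point counting on hypothetical micrographs of injured lung:
points_on_structure = 87    # test points hitting, e.g., oedematous tissue
points_on_reference = 400   # test points hitting the reference space (lung)

# Delesse principle: volume fraction equals point fraction, V_V = P_P.
volume_fraction = points_on_structure / points_on_reference
print(f"estimated volume fraction V_V = {volume_fraction:.3f} "
      f"({100 * volume_fraction:.1f}% of the reference volume)")
```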

Relevance:

30.00%

Publisher:

Abstract:

The examination of traffic accidents is daily routine in forensic medicine. An important question in the analysis of the victims of traffic accidents, for example in collisions between motor vehicles and pedestrians or cyclists, is the reconstruction of the impact situation. Apart from forensic medical examinations (external examination and autopsy), three-dimensional technologies and methods are gaining importance in forensic investigations. Besides post-mortem multi-slice computed tomography (MSCT) and magnetic resonance imaging (MRI) for the documentation and analysis of internal findings, highly precise 3D surface scanning is employed for the documentation of the external body findings and of injury-inflicting instruments. The correlation of the injuries of the body to the injury-inflicting object and the accident mechanism is of great importance. The applied methods include documentation of the external and internal body, of the involved vehicles and inflicting tools, as well as analysis of the acquired data. The body surface and the accident vehicles with their damages were digitized by 3D surface scanning. For the internal findings of the body, post-mortem MSCT and MRI were used. The analysis included the processing of the obtained data into 3D models, determination of the driving direction of the vehicle, correlation of injuries to the vehicle damages, geometric determination of the impact situation and evaluation of further findings of the accident. In the following article, the benefits of 3D documentation and of computer-assisted, drawn-to-scale 3D comparisons of the relevant injuries with the damage to the vehicle in the analysis of the course of accidents, especially with regard to the impact situation, are demonstrated in two examined cases.
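
One computational step behind matching a 3D-scanned injury to a vehicle damage pattern is the rigid alignment of corresponding point sets. The sketch below implements the standard Kabsch algorithm on toy data; it is a generic illustration, not the software used in the reported cases.

```python
import numpy as np

def kabsch(P, Q):
    """Rotation R and translation t minimising ||(P @ R.T + t) - Q||."""
    Pc, Qc = P - P.mean(axis=0), Q - Q.mean(axis=0)
    H = Pc.T @ Qc
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T   # guard against reflections
    t = Q.mean(axis=0) - R @ P.mean(axis=0)
    return R, t

rng = np.random.default_rng(0)
P = rng.standard_normal((50, 3))                       # points on the injury scan
true_R, _ = np.linalg.qr(rng.standard_normal((3, 3)))
if np.linalg.det(true_R) < 0:
    true_R[:, 0] *= -1.0                               # force a proper rotation
Q = P @ true_R.T + np.array([1.0, -2.0, 0.5])          # rotated and shifted copy
R, t = kabsch(P, Q)
print(np.allclose(P @ R.T + t, Q, atol=1e-8))          # True: transform recovered
```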

Relevance:

30.00%

Publisher:

Abstract:

Global transcriptomic and proteomic profiling platforms have yielded important insights into the complex response to ionizing radiation (IR). Nonetheless, little is known about the ways in which small cellular metabolite concentrations change in response to IR. Here, a metabolomics approach using ultraperformance liquid chromatography coupled with electrospray time-of-flight mass spectrometry was used to profile, over time, the hydrophilic metabolome of TK6 cells exposed to IR doses ranging from 0.5 to 8.0 Gy. Multivariate data analysis of the positive ions revealed dose- and time-dependent clustering of the irradiated cells and identified certain constituents of the water-soluble metabolome as being significantly depleted as early as 1 h after IR. Tandem mass spectrometry was used to confirm metabolite identity. Many of the depleted metabolites are associated with oxidative stress and DNA repair pathways. Included are reduced glutathione, adenosine monophosphate, nicotinamide adenine dinucleotide, and spermine. Similar measurements were performed with a transformed fibroblast cell line, BJ, and it was found that a subset of the identified TK6 metabolites was effective in IR dose discrimination. The GEDI (Gene Expression Dynamics Inspector) algorithm, which is based on self-organizing maps, was used to visualize dynamic global changes in the TK6 metabolome that resulted from IR. It revealed dose-dependent clustering of ions sharing the same trends in concentration change across radiation doses. "Radiation metabolomics," the application of metabolomic analysis to the field of radiobiology, promises to increase our understanding of cellular responses to stressors such as radiation.
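
A compact sketch of the multivariate step described above: ion-intensity profiles are projected into a low-dimensional space to look for dose-dependent clustering. The intensity matrix is simulated, and plain PCA stands in for the self-organizing-map approach of GEDI.

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
doses = np.repeat([0.0, 0.5, 2.0, 8.0], 5)   # Gy, 5 replicates per dose (toy design)
n_ions = 300

# Simulate dose-dependent depletion of metabolite ions plus measurement noise.
base = rng.lognormal(mean=3.0, sigma=0.5, size=n_ions)
X = np.array([base * (1 - 0.05 * d) + rng.normal(0, 1, n_ions) for d in doses])

scores = PCA(n_components=2).fit_transform(X)
for d in np.unique(doses):
    centroid = scores[doses == d].mean(axis=0)
    print(f"{d:4.1f} Gy -> PC1/PC2 centroid {centroid.round(2)}")
```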