973 results for Network Flow Interpretation
Abstract:
Among the numerous magnetic resonance imaging (MRI) techniques, perfusion MRI provides non-invasive insight into the passage of blood through the brain's vascular network. Studying disease models and transgenic mice would intrinsically help in understanding underlying brain function, cerebrovascular disease and brain disorders. This study evaluates the feasibility of performing continuous arterial spin labeling (CASL) on all cranial arteries for mapping murine cerebral blood flow at 9.4 T. We showed that with an actively detuned two-coil system, a labeling efficiency of 0.82 ± 0.03 was achieved with minimal magnetization transfer residuals in the brain. The resulting cerebral blood flow of healthy mice was 99 ± 26 mL/100g/min, in excellent agreement with other techniques. In conclusion, high magnetic fields deliver high sensitivity, allowing not only CASL but also other MR techniques, e.g. ¹H MRS and diffusion MRI, to be applied in studying murine brains.
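The reported value of 99 ± 26 mL/100g/min depends on a quantification model; below is a minimal sketch of a simplified single-compartment CASL formula (transit-time and relaxation correction terms are neglected, and all numeric inputs except the 0.82 labeling efficiency are illustrative assumptions, not study values):

```python
def casl_cbf(delta_m_over_m0, alpha, t1_blood_s, lam=0.9):
    """Simplified single-compartment CASL quantification (sketch):
    CBF = 6000 * lambda * (dM/M0) / (2 * alpha * T1), in mL/100g/min.
    lam is the blood-brain partition coefficient (mL/g); the factor
    6000 converts mL/g/s into mL/100g/min."""
    return 6000.0 * lam * delta_m_over_m0 / (2.0 * alpha * t1_blood_s)

# labeling efficiency alpha = 0.82 as reported; other inputs hypothetical
cbf = casl_cbf(delta_m_over_m0=0.01, alpha=0.82, t1_blood_s=1.9)
```

The formula is linear in the signal difference dM/M0, so doubling the perfusion-weighted signal doubles the estimated CBF.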
Abstract:
Deeply incised river networks are generally regarded as robust features that are not easily modified by erosion or tectonics. Although the reorganization of deeply incised drainage systems has been documented, its importance for the overall landscape evolution of mountain ranges and the factors that permit such reorganizations remain poorly understood. To address this problem, we have explored the rapid drainage reorganization that affected the Cahabon River in Guatemala during the Quaternary. Sediment-provenance analysis, field mapping, and electrical resistivity tomography (ERT) imaging are used to reconstruct the geometry of the valley before the river was captured. Dating of the abandoned valley sediments by the ¹⁰Be-²⁶Al burial method and geomagnetic polarity analysis allows us to determine the age of the capture events and then to quantify several processes, such as the rate of tectonic deformation of the paleovalley, the rate of propagation of post-capture drainage reversal, and the rate at which canyons that formed at the capture sites have propagated along the paleovalley. Transtensional faulting started 1 to 3 million years ago, produced ground tilting and faulting along the Cahabon River, and thus generated differential uplift rates of 0.3 ± 0.1 up to 0.7 ± 0.4 mm/y along the river's course. The river responded to faulting by incising the areas of relative uplift and depositing a few tens of meters of sediment over the areas of relative subsidence. The river then experienced two captures and one avulsion between 700 and 100 ky ago. The captures breached high-standing ridges that separate the Cahabon River from its captors. Captures occurred at specific points where the ridges are made permeable by fault damage zones and/or soluble rocks. Groundwater flow from the Cahabon River down to its captors likely increased the erosive power of the captors, thus promoting focused erosion of the ridges.
Valley-fill formation and capture occurred in close temporal succession, suggesting a genetic link between the two. We suggest that aquifers accumulated within the valley fills, increased the hydraulic head along the subterranean system connecting the Cahabon River to its captors, and promoted their development. Upon capture, the breached valley experienced widespread drainage reversal toward the capture sites. We attribute this generalized reversal to the combined effects of groundwater sapping in the valley fill, axial drainage obstruction by lateral fans, and tectonic tilting. Drainage reversal increased the size of the captured areas by a factor of 4 to 6. At the capture sites, 500-m-deep canyons have been incised into the bedrock and are propagating upstream at a rate of 3 to 11 mm/y while deepening at a rate of 0.7 to 1.5 mm/y. At this rate, 1 to 2 million years will be necessary for headward erosion to completely erase the topographic expression of the paleovalley. We conclude that the rapid reorganization of this drainage system was made possible by the way the river adjusted to the new tectonic strain field, which involved transient sedimentation along the river's course. If the river had escaped this early reorganization and had been given the time necessary to reach a new dynamic equilibrium, the transient conditions that promoted capture would have vanished and its vulnerability to capture would have been strongly reduced.
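The million-year estimate for erasing the paleovalley follows from simple rate arithmetic, t = L / v; a sketch with a hypothetical reach length, since the actual paleovalley length is not stated in the abstract:

```python
def years_to_erode(reach_length_m, headward_rate_mm_per_y):
    """Time for headward erosion to consume a paleovalley reach:
    t = L / v, with the rate converted from mm/y to m/y.
    The 10 km reach length below is an assumed example value."""
    return reach_length_m / (headward_rate_mm_per_y / 1000.0)

# the reported 3-11 mm/y propagation rates bracket the erosion time
t_fast = years_to_erode(10_000, 11)  # ~0.9 My for the fast end
t_slow = years_to_erode(10_000, 3)   # ~3.3 My for the slow end
```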
Abstract:
To evaluate primary care physicians' attitudes towards the implementation of rotavirus (RV) immunisation in the Swiss immunisation schedule, an eight-question internet-based questionnaire was sent to the 3799 subscribers of InfoVac, a nationwide web-based expert network on immunisation issues that reaches >95% of paediatricians and smaller proportions of other primary care physicians. Five demographic variables were also collected. Descriptive statistics and multivariate analyses for the main outcome "acceptance of routine RV immunisation" and other variables were performed. Diffusion of innovation theory was used for data assessment. Nine hundred seventy-seven questionnaires were returned (26%). Fifty percent of participants were paediatricians. Routine RV immunisation was supported by 146 participants (15%; so-called early adopters) and dismissed by 620 (64%), leaving 211 (21%) undecided. However, when asked whether they would recommend RV vaccination to parents if it were officially recommended by the federal authorities and reimbursed, 467 (48.5%; so-called early majority) agreed to recommend RV immunisation. Multivariate analysis revealed that physicians who would immunise their own child (OR: 5.1; 95% CI: 4.1-6.3), hospital-based physicians (OR: 1.6; 95% CI: 1.1-2.3) and physicians from the French- (OR: 1.6; 95% CI: 1.2-2.3) and Italian-speaking areas of Switzerland (OR: 2.5; 95% CI: 1.1-5.8) were more likely to support RV immunisation. Diffusion of innovation theory predicts >80% implementation if approximately 50% of a given population support an innovation. The introduction of RV immunisation in Switzerland is therefore likely to be successful if (i) the federal authorities issue an official recommendation and (ii) costs are covered by basic health care insurance.
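Odds ratios and confidence intervals like those reported above are conventionally computed from 2x2 counts; a sketch of the standard Woolf (log-odds) method, with made-up counts rather than the study's data:

```python
from math import exp, log, sqrt

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio and 95% CI from a 2x2 table:
    a, b = exposed with/without the outcome; c, d = unexposed.
    Woolf method: CI = exp(ln(OR) +/- z * sqrt(1/a + 1/b + 1/c + 1/d))."""
    or_ = (a * d) / (b * c)
    se = sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    return or_, exp(log(or_) - z * se), exp(log(or_) + z * se)

# illustrative counts only, not taken from the InfoVac survey
or_, lo, hi = odds_ratio_ci(20, 10, 15, 30)
```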
Abstract:
We propose a procedure for analyzing and characterizing complex networks. We apply it to the social network constructed from email communications within a medium-sized university with about 1700 employees. Email networks provide an accurate and nonintrusive description of the flow of information within human organizations. Our results reveal the self-organization of the network into a state where the distribution of community sizes is self-similar. This suggests that a universal mechanism, responsible for the emergence of scaling in other self-organized complex systems such as river networks, could also be the underlying driving force in the formation and evolution of social networks.
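As a toy illustration of extracting community-size statistics from an email graph, the sketch below uses connected components as a crude stand-in for the study's community-detection procedure (the names and edges are invented):

```python
from collections import defaultdict, deque

def component_sizes(edges, nodes=()):
    """Sizes of connected components of an undirected graph, found by
    BFS: a crude stand-in for community structure in an email network."""
    adj = defaultdict(set)
    all_nodes = set(nodes)
    for u, v in edges:
        adj[u].add(v)
        adj[v].add(u)
        all_nodes.update((u, v))
    seen, sizes = set(), []
    for start in all_nodes:
        if start in seen:
            continue
        queue, size = deque([start]), 0
        seen.add(start)
        while queue:
            node = queue.popleft()
            size += 1
            for nxt in adj[node]:
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
        sizes.append(size)
    return sorted(sizes, reverse=True)

# toy email graph: two groups plus one user who never emailed anyone
sizes = component_sizes([("ana", "ben"), ("ben", "eva"), ("tom", "kim")],
                        nodes=["zoe"])
```

A self-similar community-size distribution would show up as a power law when `sizes` from a real, large email network are histogrammed on log-log axes.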
Abstract:
We have used surface-based electrical resistivity tomography to detect and characterize preferential hydraulic pathways in the immediate downstream area of an abandoned, hazardous landfill. The landfill occupies the void left by a former gravel pit; its base is close to the groundwater table and lacks an engineered barrier. As such, this site is remarkably typical of many small- to medium-sized waste deposits throughout the densely populated and heavily industrialized foreland on both sides of the Alpine arc. Outflows of pollutants have persistently contaminated local drinking water supplies and necessitated a partial remediation in the form of a synthetic cover barrier, which is meant to prevent meteoric water from percolating through the waste before reaching the groundwater table. Any future additional isolation of the landfill in the form of lateral barriers thus requires adequate knowledge of potential preferential hydraulic pathways for outflowing contaminants. Our results, inferred from a suite of tomographically inverted surface-based electrical resistivity profiles oriented roughly perpendicular to the local hydraulic gradient, indicate that potential contaminant outflows would predominantly occur along an unexploited lateral extension of the original gravel deposit. This finds its expression as a distinct and laterally continuous high-resistivity anomaly in the resistivity tomograms. The interpretation is ground-truthed by a litholog from a nearby well. Since the probed glacio-fluvial deposits are largely devoid of mineralogical clay, the geometry of hydraulic and electrical pathways across the pore space of a given lithological unit can be assumed to be identical, which allows for an order-of-magnitude estimation of the overall permeability structure. These estimates indicate that the permeability of the imaged extension of the gravel body is at least two to three orders of magnitude higher than that of its finer-grained embedding matrix. This corroborates the preeminent role of the high-resistivity anomaly as a potential preferential flow path.
Abstract:
Debris flows are among the most dangerous processes in mountainous areas due to their rapid rate of movement and long runout zones. Sudden and rather unexpected impacts not only damage buildings and infrastructure but also threaten human lives. Medium- to regional-scale susceptibility analyses allow the identification of the most endangered areas and suggest where further detailed studies have to be carried out. Since data availability for larger regions is mostly the key limiting factor, empirical models with low data requirements are suitable for first overviews. In this study a susceptibility analysis was carried out for the Barcelonnette Basin, situated in the southern French Alps. By means of a methodology based on empirical rules for source identification and the empirical angle-of-reach concept for the 2-D runout computation, a worst-case scenario was first modelled. In a second step, scenarios for high-, medium- and low-frequency events were developed. A comparison with the footprints of a few mapped events indicates reasonable results but suggests a high dependency on the quality of the digital elevation model. This fact emphasises the need for a careful interpretation of the results, while remaining conscious of the inherent assumptions of the model used and the quality of the input data.
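In its simplest 1-D form, the empirical angle-of-reach (Fahrboeschung) concept used for the runout computation reduces to L = H / tan(angle); a sketch with illustrative values, not parameters from the Barcelonnette study:

```python
from math import radians, tan

def runout_length(drop_height_m, reach_angle_deg):
    """Empirical angle-of-reach concept: the runout endpoint lies where
    a line dipping at the reach angle from the source intersects the
    terrain, i.e. horizontal travel distance L = H / tan(angle)."""
    return drop_height_m / tan(radians(reach_angle_deg))

# illustrative: a 500 m drop with an 11 degree reach angle
L = runout_length(500, 11)  # roughly 2.57 km of horizontal travel
```

Smaller reach angles (typical of highly mobile, high-frequency debris flows) give longer runouts, which is how the high-, medium- and low-frequency scenarios can be parameterized.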
Abstract:
How have changes in communications technology affected the way that misinformation spreads through a population and persists? To what extent do differences in the architecture of social networks affect the spread of misinformation, relative to the rates and rules by which individuals transmit or eliminate different pieces of information (cultural traits)? Here, we use analytical models and individual-based simulations to study how a 'cultural load' of misinformation can be maintained in a population under a balance between social transmission and selective elimination of cultural traits with low intrinsic value. While considerable research has explored how network architecture affects percolation processes, we find that the relative rates at which individuals transmit or eliminate traits can have much more profound impacts on the cultural load than differences in network architecture. In particular, the cultural load is insensitive to correlations between an individual's network degree and rate of elimination when these quantities vary among individuals. Taken together, these results suggest that changes in communications technology may have influenced cultural evolution more strongly through changes in the amount of information flow, rather than the details of who is connected to whom.
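The transmission-elimination balance described above can be illustrated with a minimal individual-based simulation; this toy model assumes a fully mixed population (no explicit network) and invented parameter values:

```python
import random

def cultural_load(n_agents=50, p_transmit=0.2, p_eliminate=0.05,
                  steps=5000, seed=42):
    """Individual-based sketch: each agent holds a set of low-value
    traits. Per step, a random agent either copies a trait from a random
    peer (social transmission, seeding a new trait if the peer has none)
    or drops one of its own traits (selective elimination).
    Returns the mean number of traits per agent at the end."""
    rng = random.Random(seed)
    traits = [set() for _ in range(n_agents)]
    fresh = 0  # counter used to label newly seeded traits
    for _ in range(steps):
        i = rng.randrange(n_agents)
        r = rng.random()
        if r < p_transmit:
            j = rng.randrange(n_agents)
            if traits[j]:
                traits[i].add(rng.choice(sorted(traits[j])))
            else:
                traits[i].add(fresh)
                fresh += 1
        elif r < p_transmit + p_eliminate and traits[i]:
            traits[i].remove(rng.choice(sorted(traits[i])))
    return sum(len(t) for t in traits) / n_agents

# faster elimination -> lower misinformation load, independent of wiring
low_elim = cultural_load(p_eliminate=0.02)
high_elim = cultural_load(p_eliminate=0.50)
```

Varying `p_transmit` and `p_eliminate` shifts the equilibrium load far more than rewiring who copies from whom, consistent with the abstract's main claim.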
Abstract:
Introduction: The field of connectomic research is growing rapidly, driven by methodological advances in structural neuroimaging on many spatial scales. In particular, progress in diffusion MRI data acquisition and processing has made macroscopic structural connectivity maps available in vivo through connectome mapping pipelines (Hagmann et al, 2008), yielding so-called connectomes (Hagmann 2005; Sporns et al, 2005). These exhibit both spatial and topological information that constrain functional imaging studies and are relevant to their interpretation. The need for a special-purpose software tool for both clinical researchers and neuroscientists to support investigations of such connectome data has grown. Methods: We developed the ConnectomeViewer, a powerful, extensible software tool for visualization and analysis in connectomic research. It uses the newly defined, container-like Connectome File Format, specifying networks (GraphML), surfaces (Gifti), volumes (Nifti), track data (TrackVis) and metadata. Using Python as the programming language allows it to be cross-platform and gives it access to a multitude of scientific libraries. Results: Through a flexible plugin architecture, functionality can easily be enhanced for specific purposes. The following features are already implemented:
* Ready use of libraries, e.g. for complex network analysis (NetworkX) and data plotting (Matplotlib); more brain connectivity measures will be implemented in a future release (Rubinov et al, 2009).
* 3D view of networks with node positioning based on the corresponding ROI surface patch; other layouts are possible.
* Picking functionality to select nodes and edges, retrieve further node information (ConnectomeWiki), and toggle surface representations.
* Interactive thresholding and modality selection of edge properties using filters.
* Storage of arbitrary metadata for networks, allowing e.g. group-based analysis or meta-analysis.
* A Python shell for scripting; application data is exposed and can be modified or used for further post-processing.
* Visualization pipelines composed of filters and modules using Mayavi (Ramachandran et al, 2008).
* An interface to TrackVis to visualize track data; selected nodes are converted to ROIs for fiber filtering.
The Connectome Mapping Pipeline (Hagmann et al, 2008) was used to process 20 healthy subjects into an average connectome dataset. The figures show the ConnectomeViewer user interface using this dataset; connections are shown that occur in all 20 subjects. The dataset is freely available from the homepage (connectomeviewer.org). Conclusions: The ConnectomeViewer is a cross-platform, open-source software tool that provides extensive visualization and analysis capabilities for connectomic research. It has a modular architecture, integrates the relevant datatypes and is completely scriptable. Visit www.connectomics.org to get involved as a user or developer.
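The interactive edge thresholding mentioned in the feature list can be illustrated, independently of the actual ConnectomeViewer API, as a simple filter over attributed edges; the region names and the `fiber_count` attribute below are hypothetical:

```python
def threshold_edges(edges, attr, min_value):
    """Keep only edges whose attribute meets the threshold: a sketch of
    ConnectomeViewer-style interactive filtering on edge properties.
    Each edge is (node_u, node_v, attribute_dict)."""
    return [e for e in edges if e[2].get(attr, 0) >= min_value]

# hypothetical connectome edges with fiber counts from tractography
edges = [("lh-precentral", "lh-postcentral", {"fiber_count": 120}),
         ("lh-precentral", "rh-precentral",  {"fiber_count": 35}),
         ("lh-cuneus",     "lh-lingual",     {"fiber_count": 80})]
strong = threshold_edges(edges, "fiber_count", 50)
```

In the viewer, sliding the threshold re-runs such a filter and redraws only the surviving connections in the 3D network view.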
Abstract:
Both Bayesian networks and probabilistic evaluation are gaining increasingly widespread use within many professional branches, including forensic science. Notwithstanding, they constitute subtle topics with definitional details that require careful study. While many sophisticated developments of probabilistic approaches to the evaluation of forensic findings may readily be found in the published literature, there remains a gap with respect to writings that focus on foundational aspects and on how these may be acquired by interested scientists new to these topics. This paper takes this as a starting point to report on learning about Bayesian networks for likelihood ratio based, probabilistic inference procedures in a class of master's students in forensic science. The presentation uses an example that relies on a casework scenario drawn from the published literature, involving a questioned signature. A complicating aspect of that case study, proposed to students in a teaching scenario, is the need to consider multiple competing propositions, a setting that may not readily be approached within a likelihood ratio based framework without attention to some additional technical details. Using generic Bayesian network fragments from the existing literature on the topic, course participants were able to track the probabilistic underpinnings of the proposed scenario correctly, both in terms of likelihood ratios and of posterior probabilities. In addition, further study of the example allowed students to derive an alternative Bayesian network structure with a computational output equivalent to existing probabilistic solutions. This practical experience underlines the potential of Bayesian networks to support and clarify foundational principles of probabilistic procedures for forensic evaluation.
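For the multiple-proposition setting described above, the posterior computation that a Bayesian network performs reduces, in the discrete single-finding case, to Bayes' theorem with normalisation over all hypotheses; a sketch with invented propositions and probabilities (not the values from the published signature case):

```python
def posteriors(priors, likelihoods):
    """Posterior probabilities over multiple competing propositions:
    P(H|E) = P(H) * P(E|H) / sum over all H' of P(H') * P(E|H')."""
    joint = {h: priors[h] * likelihoods[h] for h in priors}
    z = sum(joint.values())
    return {h: p / z for h, p in joint.items()}

# hypothetical three-proposition questioned-signature scenario
priors = {"H1_genuine": 0.5, "H2_simulated": 0.25, "H3_disguised": 0.25}
likelihoods = {"H1_genuine": 0.8, "H2_simulated": 0.1, "H3_disguised": 0.05}
post = posteriors(priors, likelihoods)

# a pairwise likelihood ratio still exists for any two propositions
lr_12 = likelihoods["H1_genuine"] / likelihoods["H2_simulated"]
```

With more than two propositions there is no single likelihood ratio; pairwise ratios such as `lr_12` must be read alongside the full posterior distribution, which is the technical subtlety the teaching scenario draws attention to.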
Abstract:
BACKGROUND: This study validates the use of phycoerythrin (PE) and allophycocyanin (APC) for fluorescence resonance energy transfer (FRET) analyzed by flow cytometry. METHODS: FRET was detected when a pair of antibody conjugates directed against two noncompetitive epitopes on the same CD8alpha chain was used. FRET was also detected between antibody conjugate pairs specific for the two chains of the heterodimeric alpha(4)beta(1) integrin. Similarly, the association of the T-cell receptor (TCR) with a soluble antigen ligand was detected by FRET when anti-TCR antibody and MHC class I/peptide complexes ("tetramers") were used. RESULTS: FRET efficiency was always less than 10%, probably because of steric effects associated with the size and structure of PE and APC. Some suggestions are given for taking this and other effects (e.g., donor and acceptor concentrations) into account for a better interpretation of FRET results obtained with this pair of fluorochromes. CONCLUSIONS: We conclude that FRET assays can be carried out easily with commercially available antibodies and flow cytometers to study arrays of multimolecular complexes.
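FRET efficiency is commonly estimated from donor quenching as E = 1 - F_DA/F_D; a sketch with illustrative intensities, not measurements from this study:

```python
def fret_efficiency(f_donor_alone, f_donor_with_acceptor):
    """FRET efficiency from donor (e.g. PE) quenching:
    E = 1 - F_DA / F_D, where F_D is the donor intensity without the
    acceptor and F_DA the donor intensity with the acceptor present."""
    return 1.0 - f_donor_with_acceptor / f_donor_alone

# illustrative: 8% donor quenching, below the ~10% ceiling reported
E = fret_efficiency(1000.0, 920.0)
```

Because PE and APC are large phycobiliproteins, steric separation of the dyes keeps F_DA close to F_D, which is why measured efficiencies with this pair stay small.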
Abstract:
A geophysical and geochemical study has been conducted in a fractured carbonate aquifer located at Combioula in the southwestern Swiss Alps with the objective of detecting and characterizing hydraulically active fractures along a 260-m-deep borehole. Hydrochemical analyses, borehole diameter, temperature and fluid electrical conductivity logging data were integrated in order to relate electrokinetic self-potential signals to groundwater flow inside the fracture network. The results show a generally good, albeit locally variable, correlation between variations in the self-potential signals and variations in temperature, fluid electrical conductivity and borehole diameter. Together with the hydrochemical evidence, which proved critical for the interpretation of the self-potential data, these measurements made it possible not only to detect the hydraulically active fractures but also to characterize them as zones of fluid gain or fluid loss. The results complement the available information from the corresponding litholog and illustrate the potential of electrokinetic self-potential signals, in conjunction with temperature, fluid electrical conductivity and hydrochemical analyses, for the characterization of fractured aquifers, and thus may offer a perspective for an effective quantitative characterization of this increasingly important class of aquifers and geothermal reservoirs.
Abstract:
BACKGROUND: The diagnosis of malignant hematologic diseases has become increasingly complex during the last decade. It is based on the interpretation of results from different laboratory analyses, which range from microscopy to gene expression profiling. Recently, a method for the analysis of RNA phenotypes has been developed, the nCounter technology (NanoString® Technologies), which allows for the simultaneous quantification of hundreds of RNA molecules in biological samples. We evaluated this technique in a Swiss multi-center study on eighty-six samples from acute leukemia patients. METHODS: mRNA and protein profiles were established for normal peripheral blood and bone marrow samples. Signal intensities of the various tested antigens with surface expression were similar to those found in previously performed Affymetrix microarray analyses. Acute leukemia samples were analyzed for a set of twenty-two validated antigens, and the Pearson correlation coefficient between nCounter and flow cytometry results was calculated. RESULTS: Highly significant values between 0.40 and 0.97 were found for the twenty-two antigens tested. A second correlation analysis performed on a per-sample basis yielded concordant results between flow cytometry and nCounter in 44-100% of the antigens tested (mean = 76%), depending on the number of blasts present in a sample, the homogeneity of the blast population, and the type of leukemia (AML or ALL). CONCLUSIONS: The nCounter technology allows for fast and easy depiction of an mRNA profile from hematologic samples. This technology has the potential to become a valuable tool for the diagnosis of acute leukemias, in addition to multi-color flow cytometry.
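The per-antigen Pearson correlation used above is computed directly from paired measurements; a self-contained sketch with invented value pairs (not nCounter or cytometry data from the study):

```python
from math import sqrt

def pearson_r(x, y):
    """Pearson correlation coefficient between paired measurements,
    e.g. nCounter counts vs. flow-cytometry intensities per antigen:
    r = cov(x, y) / (sd(x) * sd(y))."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sqrt(sum((a - mx) ** 2 for a in x))
    sy = sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# invented paired measurements for one antigen across five samples
r = pearson_r([12, 30, 45, 80, 150], [10, 28, 50, 75, 160])
```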
Abstract:
Cell death is essential for a plethora of physiological processes, and its deregulation characterizes numerous human diseases. Thus, the in-depth investigation of cell death and its mechanisms constitutes a formidable challenge for fundamental and applied biomedical research, and has tremendous implications for the development of novel therapeutic strategies. It is, therefore, of utmost importance to standardize the experimental procedures that identify dying and dead cells in cell cultures and/or in tissues, from model organisms and/or humans, in healthy and/or pathological scenarios. Thus far, dozens of methods have been proposed to quantify cell death-related parameters. However, no guidelines exist regarding their use and interpretation, and nobody has thoroughly annotated the experimental settings for which each of these techniques is most appropriate. Here, we provide a nonexhaustive comparison of methods to detect cell death with apoptotic or nonapoptotic morphologies, their advantages and pitfalls. These guidelines are intended for investigators who study cell death, as well as for reviewers who need to constructively critique scientific reports that deal with cellular demise. Given the difficulties in determining the exact number of cells that have passed the point-of-no-return of the signaling cascades leading to cell death, we emphasize the importance of performing multiple, methodologically unrelated assays to quantify dying and dead cells.