172 results for Probabilistic Networks
Abstract:
Altitudinal tree lines are mainly constrained by temperature, but can also be influenced by factors such as human activity, particularly in the European Alps, where centuries of agricultural use have shaped the tree line. Over recent decades this trend has been reversed owing to changing agricultural practices and land abandonment. We aimed to combine a statistical land-abandonment model with a forest dynamics model, to account for the combined effects of climate and human land use on the Alpine tree line in Switzerland. Land-abandonment probability was expressed as a logistic regression function of degree-day sum, distance from the forest edge, soil stoniness, slope, proportion of employees in the secondary and tertiary sectors, proportion of commuters and proportion of full-time farms. This was implemented in the TreeMig spatio-temporal forest model. Distance from the forest edge and degree-day sum vary through feedback from the dynamics part of TreeMig and from climate-change scenarios, while the other variables remain constant for each grid cell over time. The new model, TreeMig-LAb, was tested on theoretical landscapes in which the variables of the land-abandonment model were varied one by one. This confirmed the strong influence of distance from forest and of slope on abandonment probability. Degree-day sum has a more complex role, with opposite influences on land abandonment and forest growth. TreeMig-LAb was also applied to a case-study area in the Upper Engadine (Swiss Alps), alongside a model in which abandonment probability was constant. Two scenarios were used: natural succession only (100% probability) and a probability of abandonment based on past transition proportions in that area (2.1% per decade). The former showed new forest growing in all but the highest-altitude locations. The latter was more realistic as to the number of newly forested cells, but their locations were random and the resulting landscape heterogeneous. 
Using the logistic regression model gave results consistent with observed patterns of land-abandonment: existing forests expanded and gaps closed, leading to an increasingly homogeneous landscape.
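The abandonment model described above is a standard logistic regression, so its per-cell probability can be sketched directly. The coefficient and covariate values below are purely illustrative stand-ins, not the fitted values from the paper:

```python
import math

def abandonment_probability(coeffs, intercept, covariates):
    """Logistic land-abandonment probability:
    p = 1 / (1 + exp(-(b0 + sum_i b_i * x_i))).
    Coefficients here are invented for illustration."""
    z = intercept + sum(coeffs[name] * covariates[name] for name in coeffs)
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical coefficients for the predictors named in the abstract.
coeffs = {
    "degree_day_sum": -0.001,
    "dist_forest_edge": -0.01,     # m; cells nearer the forest edge reforest sooner
    "stoniness": 0.5,
    "slope": 0.05,                 # degrees
    "share_secondary_tertiary": 1.2,
    "share_commuters": 0.8,
    "share_fulltime_farms": -1.5,
}
cell = {
    "degree_day_sum": 900.0,
    "dist_forest_edge": 50.0,
    "stoniness": 0.3,
    "slope": 25.0,
    "share_secondary_tertiary": 0.6,
    "share_commuters": 0.4,
    "share_fulltime_farms": 0.1,
}
p = abandonment_probability(coeffs, intercept=-2.0, covariates=cell)
print(f"decadal abandonment probability: {p:.3f}")
```

In TreeMig-LAb the first two covariates would be updated each time step from the forest-dynamics feedback, while the socio-economic shares stay fixed per grid cell.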
Abstract:
European regulatory networks (ERNs) constitute the main governance instrument for the informal co-ordination of public regulation at the European Union (EU) level. They are in charge of co-ordinating national regulators and of ensuring the implementation of harmonized regulatory policies across the EU, while also offering sector-specific expertise to the Commission. To this end, ERNs develop 'best practices' and benchmarking procedures in the form of standards, norms and guidelines to be adopted in member states. In this paper, we focus on the Committee of European Securities Regulators and examine the consequences of the policy-making structure of ERNs for the domestic adoption of standards. We find that the regulators of countries with larger financial industries tend to occupy more central positions in the network, especially among newer member states. In turn, network centrality is associated with prompter domestic adoption of standards.
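The notion of network centrality used in this finding can be illustrated with a toy computation. The graph below is invented, and stdlib power-iteration eigenvector centrality stands in for whichever measure the paper actually used:

```python
def eigenvector_centrality(adj, iters=100):
    """Power-iteration eigenvector centrality for an undirected graph
    given as {node: set(neighbours)}, normalized so the max score is 1."""
    nodes = list(adj)
    x = {n: 1.0 for n in nodes}
    for _ in range(iters):
        nxt = {n: sum(x[m] for m in adj[n]) for n in nodes}
        norm = max(nxt.values()) or 1.0
        x = {n: v / norm for n, v in nxt.items()}
    return x

# Toy regulator network: "UK" (large financial industry) is best connected.
adj = {
    "UK": {"DE", "FR", "PL", "CZ"},
    "DE": {"UK", "FR"},
    "FR": {"UK", "DE"},
    "PL": {"UK", "CZ"},
    "CZ": {"UK", "PL"},
}
c = eigenvector_centrality(adj)
print(max(c, key=c.get))  # prints "UK": the hub regulator
```

The paper's empirical claim is then that a high score in such a measure correlates with faster domestic adoption of the network's standards.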
MetaNetX.org: a website and repository for accessing, analysing and manipulating metabolic networks.
Abstract:
SUMMARY: MetaNetX.org is a website for accessing, analysing and manipulating genome-scale metabolic networks (GSMs) as well as biochemical pathways. It consistently integrates data from various public resources and makes the data accessible in a standardized format using a common namespace. Currently, it provides access to hundreds of GSMs and pathways that can be interactively compared (two or more), analysed (e.g. detection of dead-end metabolites and reactions, flux balance analysis or simulation of reaction and gene knockouts), manipulated and exported. Users can also upload their own metabolic models, choose to automatically map them into the common namespace and subsequently make use of the website's functionality. AVAILABILITY AND IMPLEMENTATION: MetaNetX.org is available at http://metanetx.org. CONTACT: help@metanetx.org.
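One of the analyses mentioned, detection of dead-end metabolites, is simple to sketch on a toy network. The reaction and metabolite identifiers below are made up and are not MetaNetX namespace IDs:

```python
def dead_end_metabolites(reactions):
    """Dead-end metabolites: species that are only ever produced or only
    ever consumed across the network (reversibility ignored for brevity).
    `reactions` maps a reaction ID to (substrates, products)."""
    produced, consumed = set(), set()
    for substrates, products in reactions.values():
        consumed.update(substrates)
        produced.update(products)
    return (produced | consumed) - (produced & consumed)

# Tiny toy network: A is never produced; D and E are never consumed.
reactions = {
    "R1": (["A"], ["B"]),
    "R2": (["B"], ["C", "D"]),
    "R3": (["C"], ["E"]),
}
print(sorted(dead_end_metabolites(reactions)))  # ['A', 'D', 'E']
```

In a genome-scale model, such metabolites flag gaps or boundary species that block flux through the reactions touching them.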
Abstract:
Synthesis report: Objective: The cognitive deficits present in the acute phase of a focal hemispheric lesion tend to be more severe and more general in nature than the residual deficits that persist into the chronic phase of recovery. In this work, we investigated patterns of auditory recovery and the relationship between deficits and damage to specific networks, taken as a cognitive model of auditory functions. Numerous human studies in neuropsychology and psychophysics, as well as activation studies, suggest that sound recognition and sound localization are carried out by anatomically and functionally distinct networks: the "What" and "Where" processing streams, both of which are present in the two hemispheres. Studies have shown that left or right focal hemispheric lesions centred on these networks are associated, in the chronic phase of recovery, with corresponding deficits in sound recognition and/or localization. Methods: In this work, we analysed auditory performance in 24 patients who had sustained focal hemispheric lesions with secondary deficits in sound recognition, localization and/or sound motion perception tasks at a first testing performed in the acute phase (9 patients), the subacute phase (6 patients) or the early chronic phase (9 patients). All of these patients underwent a second testing in the chronic phase. These observations were used to derive patterns of auditory recovery. Results: All 24 patients initially had a deficit in sound localization and/or sound motion perception. 
In the acute phase, this deficit occurred without specific damage to the "Where" network in almost half of the patients; by contrast, this situation was never observed in patients first tested in the early chronic phase. Absence of recovery tended to be associated with specific damage to the network concerned and with the persistence of a deficit beyond the acute phase. Moreover, residual deficits were not strictly related to lesion size or to the extent of damage to the specific network. Conclusion: Our results suggest that distinct mechanisms underlie recovery and plasticity at different post-lesional time periods.
Abstract:
The geometry and connectivity of fractures exert a strong influence on the flow and transport properties of fracture networks. We present a novel approach to stochastically generate three-dimensional discrete networks of connected fractures that are conditioned to hydrological and geophysical data. A hierarchical rejection sampling algorithm is used to draw realizations from the posterior probability density function at different conditioning levels. The method is applied to a well-studied granitic formation using data acquired within two boreholes located 6 m apart. The prior models include 27 fractures with their geometry (position and orientation) bounded by information derived from single-hole ground-penetrating radar (GPR) data acquired during saline tracer tests and optical televiewer logs. Eleven cross-hole hydraulic connections between fractures in neighboring boreholes and the order in which the tracer arrives at different fractures are used for conditioning. Furthermore, the networks are conditioned to the observed relative hydraulic importance of the different hydraulic connections by numerically simulating the flow response. Among the conditioning data considered, constraints on the relative flow contributions were the most effective in determining the variability among the network realizations. Nevertheless, we find that the posterior model space is strongly determined by the imposed prior bounds. Strong prior bounds were derived from GPR measurements and helped to make the approach computationally feasible. We analyze a set of 230 posterior realizations that reproduce all data given their uncertainties assuming the same uniform transmissivity in all fractures. The posterior models provide valuable statistics on length scales and density of connected fractures, as well as their connectivity. 
In an additional analysis, effective transmissivity estimates of the posterior realizations indicate a strong influence of the DFN structure, in that it induces large variations of equivalent transmissivities between realizations. The transmissivity estimates agree well with previous estimates at the site based on pumping, flowmeter and temperature data.
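The hierarchical rejection sampler described above can be sketched generically: a prior draw is kept only if it passes every conditioning level in order, so cheap checks (geometry) reject before expensive ones (flow simulation). The one-dimensional "fracture length" stand-in and its bounds below are invented for illustration:

```python
import random

def hierarchical_rejection_sample(draw_prior, condition_levels, max_tries=100_000):
    """Draw one posterior realization by hierarchical rejection sampling:
    accept a prior draw only if it satisfies all conditioning levels,
    evaluated in order from cheapest to most expensive."""
    for _ in range(max_tries):
        model = draw_prior()
        if all(level(model) for level in condition_levels):
            return model
    raise RuntimeError("no sample accepted within max_tries")

random.seed(42)
draw_prior = lambda: random.uniform(0.0, 10.0)  # prior bounds on the parameter
levels = [
    lambda L: L > 4.0,  # cheap geometric check (e.g. spans both boreholes)
    lambda L: L < 7.0,  # stand-in for an expensive check (e.g. flow response fits)
]
sample = hierarchical_rejection_sample(draw_prior, levels)
print(round(sample, 2))
```

In the paper the "model" is a full 27-fracture network and the levels are the GPR-derived bounds, the cross-hole connections, the tracer arrival order, and the simulated flow contributions; the accepted set (230 realizations here) approximates the posterior.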
Abstract:
Background The 'database search problem', that is, the strengthening of a case - in terms of probative value - against an individual who is found as a result of a database search, has been approached during the last two decades with substantial mathematical analyses, accompanied by lively debate and centrally opposing conclusions. This represents a challenging obstacle in teaching but also hinders a balanced and coherent discussion of the topic within the wider scientific and legal community. This paper revisits and tracks the associated mathematical analyses in terms of Bayesian networks. Their derivation and discussion for capturing probabilistic arguments that explain the database search problem are outlined in detail. The resulting Bayesian networks offer a distinct view on the main debated issues, along with further clarity. Methods As a general framework for representing and analyzing formal arguments in probabilistic reasoning about uncertain target propositions (that is, whether or not a given individual is the source of a crime stain), this paper relies on graphical probability models, in particular, Bayesian networks. This graphical probability modeling approach is used to capture, within a single model, a series of key variables, such as the number of individuals in a database, the size of the population of potential crime stain sources, and the rarity of the corresponding analytical characteristics in a relevant population. Results This paper demonstrates the feasibility of deriving Bayesian network structures for analyzing, representing, and tracking the database search problem. The output of the proposed models can be shown to agree with existing but exclusively formulaic approaches. Conclusions The proposed Bayesian networks allow one to capture and analyze the currently most well-supported but reputedly counter-intuitive and difficult solution to the database search problem in a way that goes beyond the traditional, purely formulaic expressions. 
The method's graphical environment, along with its computational and probabilistic architecture, represents a rich package that offers analysts and discussants additional modes of interaction, concise representation, and coherent communication.
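The flavour of the underlying probabilistic argument can be conveyed with a drastically simplified scalar computation. This is not one of the paper's Bayesian networks: it assumes a uniform prior over the population, a single database match, exclusion of all other database members, and a fixed chance-match probability for every untested individual:

```python
def source_posterior(population_size, database_size, match_prob):
    """Posterior probability that the single database matcher is the source.
    Simplifying assumptions (NOT the paper's models): uniform prior over the
    population; the other (database_size - 1) members were excluded by the
    search; each of the (population_size - database_size) untested
    individuals would match by chance with probability match_prob."""
    untested = population_size - database_size
    return 1.0 / (1.0 + untested * match_prob)

p = source_posterior(population_size=1_000_000, database_size=10_000,
                     match_prob=1e-6)
print(round(p, 3))
```

Under this toy model the database search strengthens the case, because every exclusion shrinks the pool of alternative sources; the paper's Bayesian networks make the same argument while keeping all the debated variables explicit.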
Abstract:
MOTIVATION: Understanding gene regulation in biological processes and modeling the robustness of underlying regulatory networks is an important problem that is currently being addressed by computational systems biologists. Lately, there has been a renewed interest in Boolean modeling techniques for gene regulatory networks (GRNs). However, due to their deterministic nature, it is often difficult to identify whether these modeling approaches are robust to the addition of stochastic noise, which is widespread in gene regulatory processes. Stochasticity in Boolean models of GRNs has been addressed relatively sparingly in the past, mainly by flipping the expression of genes between different expression levels with a predefined probability. This stochasticity in nodes (SIN) model leads to over-representation of noise in GRNs and hence to non-correspondence with biological observations. RESULTS: In this article, we introduce the stochasticity in functions (SIF) model for simulating stochasticity in Boolean models of GRNs. By providing biological motivation for the use of the SIF model and applying it to the T-helper and T-cell activation networks, we show that the SIF model provides more biologically robust results than the existing SIN model of stochasticity in GRNs. AVAILABILITY: Algorithms are made available in our Boolean modeling toolbox, GenYsis. The software binaries can be downloaded from http://si2.epfl.ch/~garg/genysis.html.
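The SIN-style noise that the abstract criticizes is easy to sketch: each gene's updated value is flipped with a predefined probability. The two-gene network below is invented (not the T-helper network), and the SIF alternative would instead attach the noise to the evaluation of the regulatory functions rather than to every node:

```python
import random

def noisy_boolean_step(state, functions, p_flip, rng):
    """One synchronous update of a Boolean GRN with node-level (SIN-style)
    noise: compute each gene's next value from its Boolean function, then
    flip it with probability p_flip."""
    new = {}
    for gene, f in functions.items():
        value = f(state)
        if rng.random() < p_flip:
            value = not value
        new[gene] = value
    return new

# Toy mutual-repression pair: each gene is ON iff the other is OFF.
functions = {
    "A": lambda s: not s["B"],
    "B": lambda s: not s["A"],
}
rng = random.Random(0)
state = {"A": True, "B": False}  # a fixed point of the deterministic dynamics
for _ in range(5):
    state = noisy_boolean_step(state, functions, p_flip=0.01, rng=rng)
print(state)
```

Because every node is perturbed independently at every step, noise accumulates with network size under SIN, which is the over-representation the SIF model is designed to avoid.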
Abstract:
This paper presents a method based on a geographical information system (GIS) to model ecological networks in a fragmented landscape. The ecological networks are generated with the help of a landscape model (which integrates human activities) and a wildlife dispersal model. The main results are maps that permit the analysis and understanding of the impact of human activities on wildlife dispersal. Three applications in a study area are presented: ecological networks at the landscape scale, conflicting areas at the farmstead scale and ecological distance between biotopes. These applications show the flexibility of the model and its potential to provide information on ecological networks at different planning scales.
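The "ecological distance between biotopes" application is commonly computed in GIS as a least-cost path over a friction (resistance) surface. The sketch below uses Dijkstra's algorithm on a tiny invented grid; the friction values and 4-neighbour movement rule are assumptions, not the paper's parameters:

```python
import heapq

def ecological_distance(friction, start, goal):
    """Least-cost path distance over a friction grid (Dijkstra's algorithm,
    4-neighbour moves); the cost of a step is the friction of the cell entered."""
    rows, cols = len(friction), len(friction[0])
    dist = {start: 0.0}
    pq = [(0.0, start)]
    while pq:
        d, (r, c) = heapq.heappop(pq)
        if (r, c) == goal:
            return d
        if d > dist.get((r, c), float("inf")):
            continue  # stale queue entry
        for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
            nr, nc = r + dr, c + dc
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + friction[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    heapq.heappush(pq, (nd, (nr, nc)))
    return float("inf")

# Toy landscape: 1 = open habitat, 9 = barrier (e.g. road or farmstead).
grid = [
    [1, 9, 1],
    [1, 9, 1],
    [1, 1, 1],
]
print(ecological_distance(grid, (0, 0), (0, 2)))  # detours around the barrier
```

The dispersal route that realizes this distance skirts the high-friction column, which is exactly how such maps reveal conflicting areas between human activities and wildlife movement.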
Abstract:
Background: One characteristic of post-traumatic stress disorder (PTSD) is an inability to adapt to a safe environment, i.e. to change behavior when predictions of adverse outcomes are not met. Recent studies have also indicated that PTSD patients have altered pain processing, with hyperactivation of the putamen and insula in response to aversive stimuli (Geuze et al., 2007). The present study examined neuronal responses to aversive and predicted aversive events. Methods: Twenty-four trauma-exposed non-PTSD controls and nineteen subjects with PTSD underwent fMRI during a partial-reinforcement fear-conditioning paradigm, with a mild electric shock as the unconditioned stimulus (UCS). Three conditions were analyzed: actual presentations of the UCS, events in which a UCS was expected but omitted (CS+), and events in which the UCS was neither expected nor delivered (CS-). Results: The UCS evoked significant alterations in the pain matrix, consisting of the brainstem, the midbrain, the thalamus, the insula, the anterior and middle cingulate and the contralateral somatosensory cortex. PTSD subjects displayed bilaterally elevated putamen activity in response to the electric shock, as compared to controls. In trials in which the UCS was expected but omitted, significant activations were observed in the brainstem, the midbrain, the anterior insula and the anterior cingulate. PTSD subjects displayed similar activations, but also elevated activations in the amygdala and the posterior insula. Conclusions: These results indicate altered fear and safety learning in PTSD; the neuronal activations are further explored in terms of functional connectivity using psychophysiological interaction analyses.
Abstract:
In the forensic examination of DNA mixtures, the question of how to set the total number of contributors (N) presents a topic of ongoing interest. Part of the discussion gravitates around issues of bias, in particular when assessments of the number of contributors are not made prior to considering the genotypic configuration of potential donors. Further complication may stem from the observation that, in some cases, there may be numbers of contributors that are incompatible with the set of alleles seen in the profile of a mixed crime stain, given the genotype of a potential contributor. In such situations, procedures that take a single, fixed number of contributors as their output can lead to inferential impasses. Assessing the number of contributors within a probabilistic framework can help avoid such complications. Using elements of decision theory, this paper analyses two strategies for inference on the number of contributors. One procedure is deterministic and focuses on the minimum number of contributors required to 'explain' an observed set of alleles. The other procedure is probabilistic, uses Bayes' theorem and provides a probability distribution over a set of numbers of contributors, based on the set of observed alleles as well as their respective rates of occurrence. The discussion concentrates on mixed stains of varying quality (i.e., different numbers of loci for which genotyping information is available). A so-called qualitative interpretation is pursued, since quantitative information such as peak area and height data is not taken into account. The competing procedures are compared using a standard scoring rule that penalizes the degree of divergence between a given agreed value for N, that is, the number of contributors, and the actual value taken by N. 
Using only modest assumptions and a discussion with reference to a casework example, this paper reports on analyses using simulation techniques and graphical models (i.e., Bayesian networks) to point out that setting the number of contributors to a mixed crime stain in probabilistic terms is, for the conditions assumed in this study, preferable to a decision policy that relies on categorical assumptions about N.
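The probabilistic strategy can be sketched for a single locus under a simplified "qualitative" model: the 2N alleles of N contributors are treated as independent draws from the population allele frequencies, and Bayes' theorem converts the likelihood of seeing exactly the observed allele set into a posterior over N. The frequencies, candidate values and prior below are invented, and this is a stand-in for the paper's Bayesian networks, not a reconstruction of them:

```python
from itertools import combinations

def likelihood_exact_alleles(freqs, observed, n_contribs):
    """P(the 2N alleles of N contributors show exactly the observed set),
    by inclusion-exclusion over subsets of the observed alleles.
    Simplifications: independent draws, single locus, no dropout,
    no peak-height information."""
    k = 2 * n_contribs
    obs = list(observed)
    total = 0.0
    for size in range(len(obs) + 1):
        for subset in combinations(obs, size):
            sign = (-1) ** (len(obs) - size)
            total += sign * sum(freqs[a] for a in subset) ** k
    return total

def posterior_over_N(freqs, observed, candidates, prior=None):
    """Bayes' theorem over candidate numbers of contributors."""
    prior = prior or {n: 1.0 / len(candidates) for n in candidates}
    joint = {n: prior[n] * likelihood_exact_alleles(freqs, observed, n)
             for n in candidates}
    z = sum(joint.values())
    return {n: p / z for n, p in joint.items()}

# Hypothetical allele frequencies at one locus; three alleles observed.
freqs = {"a": 0.1, "b": 0.2, "c": 0.3, "d": 0.4}
post = posterior_over_N(freqs, observed={"a", "b", "c"}, candidates=[2, 3, 4])
print({n: round(p, 3) for n, p in post.items()})
```

The deterministic competitor would simply return the minimum, ceil(3/2) = 2 contributors; the posterior instead keeps N = 3 and N = 4 in play with the weights the observed alleles warrant, which is what the paper's scoring-rule comparison rewards.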