983 results for Conventional matching networks
Abstract:
The dynamical analysis of large biological regulatory networks requires the development of scalable methods for mathematical modeling. Following the approach initially introduced by Thomas, we formalize the interactions between the components of a network in terms of discrete variables, functions, and parameters. Model simulations result in directed graphs, called state transition graphs. We are particularly interested in reachability properties and asymptotic behaviors, which correspond to terminal strongly connected components (or "attractors") in the state transition graph. A well-known problem is the exponential increase of the size of state transition graphs with the number of network components, in particular when using the biologically realistic asynchronous updating assumption. To address this problem, we have developed several complementary methods enabling the analysis of the behavior of large and complex logical models: (i) the definition of transition priority classes to simplify the dynamics; (ii) a model reduction method preserving essential dynamical properties; and (iii) a novel algorithm to compact state transition graphs and directly generate compressed representations, emphasizing relevant transient and asymptotic dynamical properties. The power of an approach combining these different methods is demonstrated by applying them to a recent multilevel logical model for the network controlling CD4+ T helper cell response to antigen presentation and to a dozen cytokines. This model accounts for the differentiation of canonical Th1 and Th2 lymphocytes, as well as of inflammatory Th17 and regulatory T cells, along with many hybrid subtypes. All these methods have been implemented in the software GINsim, which enables the definition, analysis, and simulation of logical regulatory graphs.
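To make the core construction concrete, the following is a minimal Python sketch (not GINsim itself) of the kind of analysis the abstract describes: building the asynchronous state transition graph of a tiny Boolean network and reporting its attractors as the terminal strongly connected components. The three-component network and its rules are hypothetical.

# Minimal sketch: asynchronous state transition graph and attractors
# (terminal SCCs) for a hypothetical 3-component Boolean network.
from itertools import product
import networkx as nx

rules = {
    0: lambda s: s[2],               # component 0 is activated by component 2
    1: lambda s: s[0] and not s[2],  # component 1 needs 0 and absence of 2
    2: lambda s: not s[1],           # component 2 is repressed by component 1
}

stg = nx.DiGraph()
for state in product((0, 1), repeat=len(rules)):
    stg.add_node(state)
    for i, f in rules.items():
        target = int(f(state))
        if target != state[i]:       # asynchronous updating: one component per transition
            succ = list(state)
            succ[i] = target
            stg.add_edge(state, tuple(succ))

# Attractors correspond to terminal SCCs of the state transition graph.
cond = nx.condensation(stg)
attractors = [cond.nodes[n]["members"] for n in cond.nodes if cond.out_degree(n) == 0]
print("attractors:", attractors)

On a network of this size the full graph is trivial to enumerate; the methods described in the abstract (priority classes, model reduction, compressed state transition graphs) are precisely what makes such analyses tractable when the number of components grows.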
Abstract:
Sampling issues represent a topic of ongoing interest to the forensic science community, essentially because of their crucial role in laboratory planning and working protocols. For this purpose, the forensic literature has described thorough (Bayesian) probabilistic sampling approaches, which are now widely implemented in practice. They allow one, for instance, to obtain probability statements that parameters of interest (e.g., the proportion of a seizure of items that present particular features, such as an illegal substance) satisfy particular criteria (e.g., a threshold or an otherwise limiting value). Currently, there are many approaches that allow one to derive probability statements relating to a population proportion, but questions of how a forensic decision maker - typically a client of a forensic examination or a scientist acting on behalf of a client - ought actually to decide about a proportion or a sample size have remained largely unexplored to date. The research presented here addresses methodology from decision theory that may help to deal usefully with the wide range of sampling issues typically encountered in forensic science applications. The procedures explored in this paper enable scientists to address a variety of concepts, such as the (net) value of sample information, the (expected) value of sample information, and the (expected) decision loss. All of these aspects relate directly to questions that are regularly encountered in casework. Besides probability theory and Bayesian inference, the proposed approach requires some additional elements from decision theory that may increase the effort needed for practical implementation. In view of this challenge, the present paper emphasises the merits of graphical modelling concepts, such as decision trees and Bayesian decision networks, which can support forensic scientists in applying the methodology in practice. How this may be achieved is illustrated with several examples. The graphical devices invoked here also serve the purpose of supporting the discussion of the similarities, differences, and complementary aspects of existing Bayesian probabilistic sampling criteria and the decision-theoretic approach proposed throughout this paper.
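As a rough illustration of the two layers the abstract combines (Bayesian inference about a seizure proportion, and a decision based on expected loss), here is a minimal Python sketch. It is not the authors' procedure; the prior, sample counts, threshold, and loss values are hypothetical.

# Minimal sketch: Beta-Binomial posterior for a seizure proportion,
# plus a toy two-action decision chosen by minimum expected loss.
from scipy.stats import beta

a0, b0 = 1, 1            # uniform Beta prior on the proportion theta
n, k = 12, 11            # hypothetical sample: 12 items examined, 11 positive
a, b = a0 + k, b0 + (n - k)

threshold = 0.5
p_above = 1 - beta.cdf(threshold, a, b)   # P(theta > threshold | data)
print(f"P(theta > {threshold}) = {p_above:.3f}")

# Decision-theoretic layer: losses[action][true_state], states being
# "theta above threshold" vs "theta below threshold" (illustrative values).
losses = {"declare_above": {"above": 0.0, "below": 10.0},
          "declare_below": {"above": 1.0, "below": 0.0}}
expected_loss = {act: l["above"] * p_above + l["below"] * (1 - p_above)
                 for act, l in losses.items()}
decision = min(expected_loss, key=expected_loss.get)
print(expected_loss, "->", decision)

Extending such a calculation over candidate sample sizes, before the sample is drawn, is what underlies quantities like the expected value of sample information mentioned above.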
Abstract:
Bone-mounted robotic guidance for pedicle screw placement has recently been introduced, with the aim of increasing accuracy. The aim of this prospective study was to compare this novel approach with the conventional fluoroscopy-assisted freehand technique (not two- or three-dimensional fluoroscopy-based navigation). Two groups were compared: 11 patients constituting the robotic group were instrumented with 64 pedicle screws, and 23 other patients constituting the fluoroscopic group were also instrumented with 64 pedicle screws. Screw position was assessed by two independent observers on postoperative CT scans using the Rampersaud A to D classification. No neurological complications were noted. Grade A screws (totally within pedicle margins) accounted for 79% of the screws in the robotically assisted group and 83% of the screws in the fluoroscopic group (p = 0.8). Grade C and D screws, considered misplacements, accounted for 4.7% of all robotically inserted screws and 7.8% of the fluoroscopically inserted screws (p = 0.71). The current study does not allow us to conclude that robotically assisted screw placement supersedes the conventional fluoroscopy-assisted technique, although the literature is more optimistic about the former.
Abstract:
Current parallel applications running on clusters require the use of an interconnection network to perform communications among all available computing nodes. Imbalanced communications can produce network congestion, reducing throughput and increasing latency, thus degrading overall system performance. On the other hand, parallel applications running on these networks possess representative stages that allow their characterization, as well as repetitive behavior that can be identified on the basis of this characterization. This work presents Predictive and Distributed Routing Balancing (PR-DRB), a new method developed to gradually control network congestion, based on path expansion, traffic distribution, and effective traffic load, in order to maintain low latency values. PR-DRB monitors message latencies on intermediate routers, makes decisions about alternative paths, and records information about the communication patterns encountered during congestion situations. Exploiting the repetitiveness of applications, the best solutions recorded are reapplied when a saved communication pattern reappears. Traffic congestion experiments were conducted to evaluate the performance of the method, and improvements were observed.
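The following Python sketch illustrates the idea summarised above, not the authors' implementation: when measured latency exceeds a threshold, traffic is spread over alternative paths, and the resulting configuration is remembered under a key describing the communication pattern so it can be reapplied when that pattern recurs. The threshold, path names, and pattern key are hypothetical.

# Minimal sketch of congestion-triggered path expansion with
# pattern-keyed reuse of previously recorded solutions.
LATENCY_THRESHOLD_US = 50.0
best_solution = {}   # communication pattern -> path set that resolved congestion

def pattern_key(flows):
    """Identify a repetitive communication pattern by its (src, dst) pairs."""
    return frozenset((f["src"], f["dst"]) for f in flows)

def route(flows, default_paths, alternative_paths, measured_latency_us):
    key = pattern_key(flows)
    if key in best_solution:                      # pattern seen before: reapply
        return best_solution[key]
    if measured_latency_us <= LATENCY_THRESHOLD_US:
        return default_paths                      # no congestion detected
    expanded = default_paths + alternative_paths  # gradual path expansion
    best_solution[key] = expanded                 # record for future repetitions
    return expanded

flows = [{"src": 0, "dst": 5}, {"src": 1, "dst": 5}]
print(route(flows, ["p0"], ["p1", "p2"], measured_latency_us=80.0))  # expands paths
print(route(flows, ["p0"], ["p1", "p2"], measured_latency_us=20.0))  # reapplies stored solution

In the actual method this logic is distributed across intermediate routers and driven by per-message latency measurements rather than a single scalar, but the record-and-reapply structure is the same.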
Abstract:
Patient adherence is often poor for hypertension and dyslipidaemia. Monitoring drug adherence might improve the control of these risk factors, but little is known about it in ambulatory care. We conducted a randomised controlled study in networks of community-based pharmacists and physicians in the canton of Fribourg to examine whether monitoring drug adherence with an electronic monitor (MEMS) would improve risk factor control among treated but uncontrolled hypertensive and dyslipidaemic patients. The results indicate that MEMS achieves better blood pressure control and lipid profiles, although its implementation requires considerable resources. The study also shows the value of collaboration between physicians and pharmacists in the field of patient adherence to improve the ambulatory care of patients with cardiovascular risk factors.
Abstract:
PURPOSE: Postmortem computed tomography angiography (PMCTA) was introduced into forensic investigations a few years ago. It provides reliable images that can be consulted at any time. Conventional autopsy remains the reference standard for defining the cause of death, but provides only limited possibility of a second examination. This study compares these two procedures and discusses findings that can be detected exclusively using each method. MATERIALS AND METHODS: This retrospective study compared radiological reports from PMCTA to reports from conventional autopsy for 50 forensic autopsy cases. Reported findings from autopsy and PMCTA were extracted and compared to each other. PMCTA was performed using a modified heart-lung machine and the oily contrast agent Angiofil® (Fumedica AG, Muri, Switzerland). RESULTS: PMCTA and conventional autopsy would have drawn similar conclusions regarding causes of death. Nearly 60 % of all findings were visualized with both techniques. PMCTA demonstrates a higher sensitivity for identifying skeletal and vascular lesions. However, vascular occlusions due to postmortem blood clots could be falsely assumed to be vascular lesions. In contrast, conventional autopsy does not detect all bone fractures or the exact source of bleeding. Conventional autopsy provides important information about organ morphology and remains the only way to diagnose a vital vascular occlusion with certitude. CONCLUSION: Overall, PMCTA and conventional autopsy provide comparable findings. However, each technique presents advantages and disadvantages for detecting specific findings. To correctly interpret findings and clearly define the indications for PMCTA, these differences must be understood.
Abstract:
Mycolic acid analysis by thin-layer chromatography (TLC) has been employed by several laboratories worldwide as a method for the fast identification of mycobacteria. This method was introduced in Brazil by our laboratory in 1992 as a routine identification technique. Up to the present, 861 isolated strains have been identified by mycolic acid TLC and by standard biochemical tests; 61% of these strains came from clinical samples, 4% were isolated from frogs, and 35% came from environmental samples. Mycobacterium tuberculosis strains identified by classical methods were confirmed by their mycolic acid contents (I, III and IV). The method allowed earlier differentiation of the M. avium complex - MAC (mycolic acids I, IV and VI) from M. simiae (acids I, II and IV), both of which have similar biochemical properties. The method also made it possible to distinguish M. fortuitum (acids I and V) from M. chelonae (acids I and II), and to detect cases of mixed mycobacterial infection, such as M. tuberculosis with MAC and M. fortuitum with MAC. In conclusion, four years of experience show that mycolic acid TLC is an easy, reliable, fast and inexpensive method, and an important tool to complement conventional mycobacteria identification methods.
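Purely as an illustration of the differentiation logic quoted above, the short Python sketch below looks up a species by its mycolic acid TLC profile. The table contains only the combinations mentioned in the abstract; it is not a complete or authoritative identification key.

# Minimal sketch: species lookup from mycolic acid TLC profiles
# (profiles limited to those quoted in the abstract).
PROFILES = {
    frozenset({"I", "III", "IV"}): "M. tuberculosis",
    frozenset({"I", "IV", "VI"}):  "M. avium complex (MAC)",
    frozenset({"I", "II", "IV"}):  "M. simiae",
    frozenset({"I", "V"}):         "M. fortuitum",
    frozenset({"I", "II"}):        "M. chelonae",
}

def identify(acids):
    return PROFILES.get(frozenset(acids), "no match in table")

print(identify(["I", "IV", "VI"]))   # -> M. avium complex (MAC)
print(identify(["I", "II"]))         # -> M. chelonae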
Abstract:
Human imaging studies examining fear conditioning have mainly focused on the neural responses to conditioned cues. In contrast, the neural basis of the unconditioned response and the mechanisms by which fear modulates inter-regional functional coupling have received limited attention. We examined the neural responses to an unconditioned stimulus using a partial-reinforcement fear conditioning paradigm and functional MRI. The analysis focused on: (1) the effects of an unconditioned stimulus (an electric shock) that was either expected and actually delivered, or expected but not delivered; (2) how related brain activity changed across conditioning trials; and (3) how shock expectation influenced inter-regional coupling within the fear network. We found that: (1) the delivery of the shock engaged the red nucleus, amygdala, dorsal striatum, insula, and somatosensory and cingulate cortices; (2) when the shock was expected but not delivered, only the red nucleus, the anterior insular and dorsal anterior cingulate cortices showed activity increases that were sustained across trials; and (3) psycho-physiological interaction analysis demonstrated that fear led to increased red nucleus coupling to the insula but decreased hippocampus coupling to the red nucleus, thalamus and cerebellum. The hippocampus and the anterior insula may serve as hubs facilitating the switch between engagement of a defensive immediate-fear network and a resting network.
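For readers unfamiliar with the coupling analysis mentioned in point (3), here is a minimal Python sketch of a psycho-physiological interaction (PPI) regression on simulated data; it is not the authors' pipeline. The coupling between a target region and a seed region is tested for modulation by the task (shock expectation) via the seed-by-task interaction regressor; all time courses and parameters below are simulated.

# Minimal sketch: PPI regression on simulated time courses.
import numpy as np

rng = np.random.default_rng(0)
n = 200
seed = rng.standard_normal(n)                  # seed-region time course
task = (np.arange(n) % 40 < 20).astype(float)  # block design: expectation on/off
ppi = seed * (task - task.mean())              # interaction term (mean-centred task)
target = 0.4 * seed + 0.2 * task + 0.6 * ppi + rng.standard_normal(n)

X = np.column_stack([np.ones(n), seed, task, ppi])
betas, *_ = np.linalg.lstsq(X, target, rcond=None)
print("PPI (task-dependent coupling) estimate:", round(betas[3], 2))

A positive interaction coefficient indicates stronger seed-to-target coupling during the expectation condition, which is the kind of effect reported above for red nucleus to insula coupling.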