990 results for Interpretative structural modeling



AIMS: Patients with well-tolerated sustained monomorphic ventricular tachycardia (SMVT) and left ventricular ejection fraction (LVEF) over 30% may benefit from a primary strategy of VT ablation without immediate need for a 'back-up' implantable cardioverter-defibrillator (ICD). METHODS AND RESULTS: One hundred and sixty-six patients with structural heart disease (SHD), LVEF over 30%, and well-tolerated SMVT (no syncope) underwent primary radiofrequency ablation without ICD implantation at eight European centres. There were 139 men (84%) with mean age 62 ± 15 years and mean LVEF of 50 ± 10%. Fifty-five percent had ischaemic heart disease, 19% non-ischaemic cardiomyopathy, and 12% arrhythmogenic right ventricular cardiomyopathy. Three hundred seventy-eight similar patients were implanted with an ICD during the same period and served as a control group. All-cause mortality was 12% (20 patients) over a mean follow-up of 32 ± 27 months. Eight patients (40%) died from non-cardiovascular causes, eight (40%) from non-arrhythmic cardiovascular causes, and four (20%) died suddenly (SD) (2.4% of the population). All-cause mortality in the control group was 12%. Twenty-seven patients (16%) had a non-fatal recurrence at a median time of 5 months, while 20 patients (12%) required an ICD, of whom four (20%) died. CONCLUSION: Patients with well-tolerated SMVT, SHD, and LVEF > 30% undergoing primary VT ablation without a back-up ICD had a very low rate of arrhythmic death and recurrences were generally non-fatal. These data would support a randomized clinical trial comparing this approach with others incorporating implantation of an ICD as a primary strategy.


1. Species distribution modelling is used increasingly in both applied and theoretical research to predict how species are distributed and to understand attributes of species' environmental requirements. In species distribution modelling, various statistical methods are used that combine species occurrence data with environmental spatial data layers to predict the suitability of any site for that species. While the number of data sharing initiatives involving species' occurrences in the scientific community has increased dramatically over the past few years, various data quality and methodological concerns related to using these data for species distribution modelling have not been addressed adequately. 2. We evaluated how uncertainty in georeferences and associated locational error in occurrences influence species distribution modelling using two treatments: (1) a control treatment where models were calibrated with original, accurate data and (2) an error treatment where data were first degraded spatially to simulate locational error. To incorporate error into the coordinates, we moved each coordinate with a random number drawn from the normal distribution with a mean of zero and a standard deviation of 5 km. We evaluated the influence of error on the performance of 10 commonly used distributional modelling techniques applied to 40 species in four distinct geographical regions. 3. Locational error in occurrences reduced model performance in three of these regions; nevertheless, relatively accurate predictions of species distributions were possible for most species, even with degraded occurrences. Two species distribution modelling techniques, boosted regression trees and maximum entropy, were the best performing models in the face of locational errors. The results obtained with boosted regression trees were only slightly degraded by errors in location, and the results obtained with the maximum entropy approach were not affected by such errors. 4. Synthesis and applications. To use the vast array of occurrence data that exists currently for research and management relating to the geographical ranges of species, modellers need to know the influence of locational error on model quality and whether some modelling techniques are particularly robust to error. We show that certain modelling techniques are particularly robust to a moderate level of locational error and that useful predictions of species distributions can be made even when occurrence data include some error.
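
As a rough illustration of the error treatment described in point 2 above, the spatial degradation can be reproduced in a few lines of Python. This is a minimal sketch assuming projected occurrence coordinates in kilometres; `degrade_coordinates` and the synthetic points are hypothetical, not material from the study.

```python
import numpy as np

def degrade_coordinates(coords_km, sd_km=5.0, seed=None):
    """Simulate locational error: shift each coordinate by Gaussian
    noise with mean 0 and standard deviation sd_km (5 km above)."""
    rng = np.random.default_rng(seed)
    return coords_km + rng.normal(0.0, sd_km, size=coords_km.shape)

# Hypothetical occurrence records (x/y in km); models would be calibrated
# once on the original points and once on the degraded copies.
rng = np.random.default_rng(1)
occurrences = rng.uniform(0, 500, size=(100, 2))
degraded = degrade_coordinates(occurrences, sd_km=5.0, seed=42)
```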


The functionality of adult neocortical circuits can be altered by novel experiences or learning. This functional plasticity appears to rely on changes in the strength of neuronal connections that were established during development. Here we will describe some of our studies in which we have addressed whether structural changes, including the remodeling of axons and dendrites with synapse formation and elimination, could underlie experience-dependent plasticity in the adult neocortex. Using 2-photon laser-scanning microscopes and transgenic mice expressing GFP in a subset of pyramidal cells, we have observed that a small subset of dendritic spines appears and disappears on a daily basis, whereas the majority of spines persists for months. Axonal boutons from different neuronal classes displayed similar behavior, although the extent of remodeling varied. Under baseline conditions, new spines in the barrel cortex were mostly transient and rarely survived for more than a week. However, when every other whisker was trimmed, the generation and loss of persistent spines was enhanced. Ultrastructural reconstruction of previously imaged spines and boutons showed that new spines slowly form synapses. New spines persisting for a few days always had synapses, whereas very young spines often lacked synapses. New synapses were predominantly found on large, multi-synapse boutons, suggesting that spine growth is followed by synapse formation, preferentially on existing boutons. Altogether our data indicate that novel sensory experience drives the stabilization of new spines on subclasses of cortical neurons and promotes the formation of new synapses. These synaptic changes likely underlie experience-dependent functional remodeling of specific neocortical circuits.


A first episode of depression after 65 years of age has long been associated with both severe macrovascular and small-vessel microvascular pathology. Among the three most frequent forms of depression in old age, post-stroke depression has been associated with abrupt damage to cortical circuits involved in monoamine production and mood regulation. Late-onset depression (LOD) in the absence of stroke has been related to lacunes and white matter lesions that invade both the neocortex and subcortical nuclei. Recurrent late-life depression is thought to induce neuronal loss in the hippocampal formation and white matter lesions that affect limbic pathways. Despite an impressive number of magnetic resonance imaging (MRI) studies in this field, the existence of a causal relationship between structural changes in the human brain and LOD remains controversial. The present article provides a critical overview of the contribution of neuropathology to post-stroke, late-onset, and late-life recurrent depression. Recent autopsy findings challenge the role of stroke location in the occurrence of post-stroke depression by pointing to the deleterious effect of subcortical lacunes. Despite the lines of evidence supporting an association between MRI-assessed white matter changes and mood dysregulation, lacunes and periventricular and deep white matter demyelination are all unrelated to the occurrence of LOD. Along the same lines, neuropathological data show that early-onset depression is not associated with an acceleration of aging-related neurodegenerative changes in the human brain. However, these data also support the neurotoxic theory of depression by showing that neuronal loss occurs in the hippocampus of chronically depressed patients. These three paradigms are discussed in light of the complex relationships between psychosocial determinants and biological vulnerability in affective disorders.


The dynamical analysis of large biological regulatory networks requires the development of scalable methods for mathematical modeling. Following the approach initially introduced by Thomas, we formalize the interactions between the components of a network in terms of discrete variables, functions, and parameters. Model simulations result in directed graphs, called state transition graphs. We are particularly interested in reachability properties and asymptotic behaviors, which correspond to terminal strongly connected components (or "attractors") in the state transition graph. A well-known problem is the exponential increase of the size of state transition graphs with the number of network components, in particular when using the biologically realistic asynchronous updating assumption. To address this problem, we have developed several complementary methods enabling the analysis of the behavior of large and complex logical models: (i) the definition of transition priority classes to simplify the dynamics; (ii) a model reduction method preserving essential dynamical properties; and (iii) a novel algorithm to compact state transition graphs and directly generate compressed representations, emphasizing relevant transient and asymptotic dynamical properties. The power of an approach combining these methods is demonstrated by applying them to a recent multilevel logical model of the network controlling the CD4+ T helper cell response to antigen presentation and to a dozen cytokines. This model accounts for the differentiation of canonical Th1 and Th2 lymphocytes, as well as of inflammatory Th17 and regulatory T cells, along with many hybrid subtypes. All these methods have been implemented in the software GINsim, which enables the definition, analysis, and simulation of logical regulatory graphs.
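
To make the asynchronous updating assumption and the notion of attractors concrete, here is a minimal Python sketch (using networkx) on a hypothetical two-component Boolean network forming a negative feedback loop; it is not the GINsim implementation, and the CD4+ T cell model is far larger and multilevel.

```python
import itertools
import networkx as nx

# Hypothetical Boolean network: x activates y, y represses x.
rules = {
    0: lambda s: 1 - s[1],   # x* = NOT y
    1: lambda s: s[0],       # y* = x
}

def asynchronous_stg(rules, n):
    """Build the asynchronous state transition graph: from each state,
    each component that is called to change yields its own transition."""
    g = nx.DiGraph()
    for state in itertools.product((0, 1), repeat=n):
        g.add_node(state)
        for i, f in rules.items():
            if f(state) != state[i]:
                succ = list(state)
                succ[i] = f(state)
                g.add_edge(state, tuple(succ))
    return g

stg = asynchronous_stg(rules, n=2)

# Attractors are the terminal strongly connected components: SCCs with
# no outgoing edge in the condensation of the state transition graph.
cond = nx.condensation(stg)
attractors = [cond.nodes[c]["members"] for c in cond if cond.out_degree(c) == 0]
print(attractors)   # one cyclic attractor visiting all four states
```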


Pyochelin (Pch) and enantiopyochelin (EPch) are enantiomeric siderophores with three chiral centers, produced under iron-limiting conditions by Pseudomonas aeruginosa and Pseudomonas fluorescens, respectively. After iron chelation in the extracellular medium, Pch-Fe and EPch-Fe are recognized and transported by their specific outer-membrane transporters: FptA in P. aeruginosa and FetA in P. fluorescens. Structural analysis of FetA-EPch-Fe and FptA-Pch-Fe, combined with mutagenesis and docking studies, revealed the structural basis of the stereospecific recognition of these enantiomers by their respective transporters. Whereas FetA and FptA have low sequence identity but high structural homology, the Pch and EPch binding pockets share no structural homology but display similar physicochemical properties. The stereospecific recognition of the two enantiomers by their corresponding transporters is imposed by the configuration of the siderophore's C4'' and C2'' chiral centers. This recognition involves specific hydrogen bonds between the Arg91 guanidinium group and EPch-Fe for FetA, and between the Leu117-Leu116 main chain and Pch-Fe for FptA. FetA and FptA are the first membrane receptors to be structurally described with opposite binding enantioselectivities for their ligands, giving insight into the structural basis of their enantiospecificity.


From toddler to late teenager, the macroscopic pattern of axonal projections in the human brain remains largely unchanged while undergoing dramatic functional modifications that lead to network refinement. These functional modifications are mediated by increasing myelination and changes in axonal diameter and synaptic density, as well as changes in neurochemical mediators. Here we explore the contribution of white matter maturation to the development of connectivity between ages 2 and 18 years using high b-value diffusion MRI tractography and connectivity analysis. We measured changes in connection efficacy as the inverse of the average diffusivity along a fiber tract. We observed significant refinement in specific metrics of network topology, including a significant increase in node strength and efficiency along with a decrease in clustering. Major structural modules and hubs were in place by 2 years of age, and they continued to strengthen their profile during subsequent development. Recording resting-state functional MRI from a subset of subjects, we confirmed a positive correlation between structural and functional connectivity and observed that this relationship strengthened with age. Continuously increasing integration and decreasing segregation of structural connectivity with age suggest that network refinement mediated by white matter maturation promotes increased global efficiency. Moreover, the strengthening of the correlation between structural and functional connectivity with age suggests that white matter connectivity, in combination with other factors that are partially captured by inverse average diffusivity, such as differential modulation of axonal diameter and myelin thickness, plays an increasingly important role in creating brain-wide coherence and synchrony.
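
For readers unfamiliar with these graph metrics, the sketch below computes node strength, weighted clustering, and global efficiency with networkx on a synthetic matrix. The edge weights follow the efficacy definition above (inverse of average diffusivity), but the matrix itself and its construction are illustrative assumptions, not the study's pipeline.

```python
import numpy as np
import networkx as nx

# Synthetic stand-in for tractography output: mean diffusivity along the
# tract joining each pair of regions (symmetric, no self-connections).
rng = np.random.default_rng(0)
diffusivity = rng.uniform(0.5, 1.5, size=(10, 10))
diffusivity = (diffusivity + diffusivity.T) / 2
np.fill_diagonal(diffusivity, np.inf)   # 1/inf -> 0, so no self-loops

# Connection efficacy as defined above: inverse of average diffusivity.
efficacy = 1.0 / diffusivity
g = nx.from_numpy_array(efficacy)

strength = dict(g.degree(weight="weight"))      # node strength
clustering = nx.clustering(g, weight="weight")  # weighted clustering
efficiency = nx.global_efficiency(g)            # unweighted in networkx
```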


Analysis of gas emissions by the input-output subsystem approach provides detailed insight into pollution generation in an economy. Structural decomposition analysis, on the other hand, identifies the factors behind the changes in key variables over time. Extending the input-output subsystem model to account for the changes in these variables reveals the channels by which environmental burdens are caused and transmitted throughout the production system. In this paper we propose a decomposition of the changes in the components of CO2 emissions captured by an input-output subsystems representation. The empirical application is to the Spanish service sector, with economic and environmental data for the years 1990 and 2000. Our results show that services increased their CO2 emissions mainly because of a rise in the emissions generated by non-services to cover the final demand for services. In all service activities, the decomposed effects show a decrease in CO2 emissions due to falling emission coefficients (i.e., emissions per unit of output), compensated by an increase in emissions driven both by changes in the input-output coefficients and by the rise in demand for services. Finally, large asymmetries exist not only in the quantitative changes in the CO2 emissions of the various services but also in the decomposed effects of these changes. Keywords: structural decomposition analysis, input-output subsystems, CO2 emissions, service sector.
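
The logic of a structural decomposition can be shown compactly. Below is a minimal numeric sketch in Python of a standard two-polar decomposition of emissions E = e·L·y into intensity, input-output (Leontief), and final-demand effects; the three-sector numbers are placeholders, not the Spanish data, and the paper's subsystem decomposition is more disaggregated than this.

```python
import numpy as np

def leontief_inverse(A):
    """L = (I - A)^-1 for a technical coefficients matrix A."""
    return np.linalg.inv(np.eye(A.shape[0]) - A)

# Placeholder three-sector data for two years (not the 1990/2000 tables).
A0 = np.array([[0.10, 0.20, 0.00], [0.00, 0.10, 0.30], [0.20, 0.00, 0.10]])
A1 = 1.05 * A0
e0, e1 = np.array([0.5, 0.3, 0.8]), np.array([0.4, 0.3, 0.7])  # emissions/output
y0, y1 = np.array([100.0, 80.0, 60.0]), np.array([120.0, 90.0, 70.0])

L0, L1 = leontief_inverse(A0), leontief_inverse(A1)

# Two-polar (average-weighted) decomposition of the change in E = e L y.
intensity = 0.5 * (e1 - e0) @ (L0 @ y0 + L1 @ y1)          # emission coefficients
technology = 0.5 * (e0 @ (L1 - L0) @ y1 + e1 @ (L1 - L0) @ y0)  # IO coefficients
demand = 0.5 * (e0 @ L0 + e1 @ L1) @ (y1 - y0)             # final demand

delta_E = e1 @ L1 @ y1 - e0 @ L0 @ y0
assert np.isclose(intensity + technology + demand, delta_E)  # exact decomposition
```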


An ab initio structure prediction approach adapted to the peptide-major histocompatibility complex (MHC) class I system is presented. Based on structure comparisons of a large set of peptide-MHC class I complexes, a molecular dynamics protocol is proposed that uses simulated annealing (SA) cycles to sample the conformational space of the peptide in its fixed MHC environment. A set of 14 peptide-human leukocyte antigen (HLA) A0201 and 27 peptide-non-HLA A0201 complexes for which X-ray structures are available is used to test the accuracy of the prediction method. For each complex, 1000 peptide conformers are obtained from the SA sampling. A graph theory clustering algorithm based on heavy-atom root-mean-square deviation (RMSD) values is applied to the sampled conformers. The clusters are ranked using cluster size and mean effective or conformational free energies, with solvation free energies computed using Generalized Born MV 2 (GB-MV2) and Poisson-Boltzmann (PB) continuum models. The final conformation is chosen as the center of the best-ranked cluster. With conformational free energies, the overall prediction success is 83% using a 1.00 Å RMSD criterion (relative to the crystal structure) for main-chain atoms, and 76% using a 1.50 Å criterion for heavy atoms. The prediction success is even higher for the set of 14 peptide-HLA A0201 complexes: 100% of the peptides have main-chain RMSD values ≤ 1.00 Å and 93% have heavy-atom RMSD values ≤ 1.50 Å. This structure prediction method can be applied to complexes of natural or modified antigenic peptides in their MHC environment, with the aim of performing rational structure-based optimizations of tumor vaccines.
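
A simplified version of the clustering step can be sketched as follows: link conformers whose pairwise RMSD falls below a cutoff, take connected components as clusters, and pick the central member of the largest cluster. This is an assumed, stripped-down stand-in for the paper's graph-theory algorithm and free-energy ranking, operating on coordinates already superposed onto the fixed MHC frame.

```python
import numpy as np
import networkx as nx

def rmsd(a, b):
    """RMSD between two (n_atoms, 3) coordinate arrays, assumed to be
    already superposed onto the fixed MHC frame (no fitting here)."""
    return np.sqrt(np.mean(np.sum((a - b) ** 2, axis=1)))

def cluster_conformers(conformers, cutoff=1.0):
    """Link conformers with RMSD < cutoff, cluster by connected
    components, and return the clusters (largest first) plus the center
    of the top cluster (lowest mean RMSD to its members)."""
    g = nx.Graph()
    g.add_nodes_from(range(len(conformers)))
    for i in range(len(conformers)):
        for j in range(i + 1, len(conformers)):
            if rmsd(conformers[i], conformers[j]) < cutoff:
                g.add_edge(i, j)
    clusters = sorted(nx.connected_components(g), key=len, reverse=True)
    top = clusters[0]
    center = min(top, key=lambda i: np.mean([rmsd(conformers[i], conformers[j])
                                             for j in top]))
    return clusters, center

# Example with fake conformers: 50 random 9-atom "peptides".
rng = np.random.default_rng(0)
conformers = [rng.normal(size=(9, 3)) for _ in range(50)]
clusters, center = cluster_conformers(conformers, cutoff=2.5)
```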


Empirical modeling of exposure levels has been popular for identifying exposure determinants in occupational hygiene. Traditional data-driven methods used to choose a model on which to base inferences have typically not accounted for the uncertainty linked to the process of selecting the final model. Several new approaches propose making statistical inferences from a set of plausible models rather than from a single model regarded as 'best'. This paper introduces the multimodel averaging approach described in the monograph by Burnham and Anderson. In their approach, a set of plausible models is defined a priori by taking into account the sample size and previous knowledge of variables that influence exposure levels. The Akaike information criterion (AIC) is then calculated to evaluate the relative support of the data for each model, expressed as an Akaike weight, interpreted as the probability that the model is the best approximating model given the model set. The model weights can then be used to rank models, quantify the evidence favoring one over another, perform multimodel prediction, estimate the relative influence of the potential predictors, and estimate multimodel-averaged effects of determinants. The whole approach is illustrated with the analysis of a data set of 1500 volatile organic compound exposure levels collected by the Institute for Work and Health (Lausanne, Switzerland) over 20 years, each concentration having been divided by the relevant Swiss occupational exposure limit and log-transformed before analysis. Multimodel inference represents a promising procedure for modeling exposure levels: it incorporates the notion that several models can be supported by the data and makes it possible to evaluate, to some extent, the model selection uncertainty that is seldom mentioned in current practice.
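
The core computation, Akaike weights from AIC values, is short enough to show. The formula is the standard one from Burnham and Anderson; the AIC values below are hypothetical.

```python
import numpy as np

def akaike_weights(aic):
    """Akaike weights: w_i = exp(-d_i/2) / sum_j exp(-d_j/2), where
    d_i = AIC_i - min(AIC); w_i is interpreted as the probability that
    model i is the best approximating model in the set."""
    aic = np.asarray(aic, dtype=float)
    rel_lik = np.exp(-0.5 * (aic - aic.min()))   # relative likelihoods
    return rel_lik / rel_lik.sum()

# Hypothetical AICs for four candidate exposure models.
w = akaike_weights([1012.3, 1014.1, 1015.7, 1022.9])
# A multimodel-averaged effect of a determinant is then the w-weighted
# sum of its per-model estimates (counting 0 where it is absent).
```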


We study the properties of the well-known replicator dynamics when applied to a finitely repeated version of the Prisoners' Dilemma game. We characterize the behavior of the dynamics under strongly simplifying assumptions (i.e., only three strategies are available) and show that the basin of attraction of defection shrinks as the number of repetitions increases. After discussing the difficulties involved in trying to relax these 'strongly simplifying assumptions', we approach the same model by means of simulations based on genetic algorithms. The resulting simulations show system behavior very close to that predicted by the replicator dynamics, without imposing any of the assumptions of the mathematical model. Our main conclusion is that mathematical and computational models are good complements for research in the social sciences. Indeed, while computational models are extremely useful for extending the scope of the analysis to complex scenarios that are hard to analyze mathematically, formal models can be useful for verifying and explaining the outcomes of computational models.
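
A discretized replicator dynamics is easy to simulate. In the sketch below, the three strategies are taken to be always-defect, always-cooperate, and tit-for-tat, with average per-round payoffs for a 10-round repeated Prisoners' Dilemma (T=5, R=3, P=1, S=0); these choices are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

def replicator_step(x, A, dt=0.01):
    """One Euler step of x_i' = x_i * ((A x)_i - x . (A x)); the step
    preserves sum(x) = 1 exactly."""
    fitness = A @ x
    return x + dt * x * (fitness - x @ fitness)

# Row i: average per-round payoff of strategy i against strategy j in a
# 10-round repeated PD (strategies: AllD, AllC, TFT; T=5, R=3, P=1, S=0).
A = np.array([[1.0, 5.0, 1.4],
              [0.0, 3.0, 3.0],
              [0.9, 3.0, 3.0]])

x = np.array([0.3, 0.3, 0.4])   # initial population shares
for _ in range(20000):
    x = replicator_step(x, A)
print(x.round(3))               # long-run strategy mix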


We use a dynamic factor model to provide a semi-structural representation for 101 quarterly US macroeconomic series. We find that (i) the US economy is well described by a small number of structural shocks, between two and six. Focusing on the four-shock specification, we identify, using sign restrictions, two non-policy shocks, demand and supply, and two policy shocks, monetary and fiscal. We obtain the following results. (ii) Both supply and demand shocks are important sources of fluctuations; supply prevails for GDP, while demand prevails for employment and inflation. (iii) Policy matters: both monetary and fiscal policy shocks have sizeable effects on output and prices, with little evidence of crowding out; both monetary and fiscal authorities implement important systematic countercyclical policies reacting to demand shocks. (iv) Negative demand shocks have a large long-run positive effect on productivity, consistent with the Schumpeterian "cleansing" view of recessions.
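
The sign-restriction step can be illustrated generically: rotate a reduced-form impact matrix by random orthogonal matrices and keep the rotations whose impact responses match the imposed signs. This is a bare-bones rejection sampler under assumed inputs, not the paper's factor-model machinery.

```python
import numpy as np

def sign_identified_impacts(B, signs, n_draws=10_000, seed=0):
    """Rotate the reduced-form impact matrix B (n_vars x n_shocks) by
    random orthogonal Q (QR of a Gaussian draw, Haar-distributed after
    sign normalization) and keep candidates whose impact responses match
    `signs` (+1, -1, or 0 for unrestricted)."""
    rng = np.random.default_rng(seed)
    n = B.shape[1]
    kept = []
    for _ in range(n_draws):
        q, r = np.linalg.qr(rng.standard_normal((n, n)))
        q *= np.sign(np.diag(r))   # pin down column signs
        impact = B @ q
        if np.all((signs == 0) | (np.sign(impact) == signs)):
            kept.append(impact)
    return kept

# Hypothetical pattern for (output, inflation) x (demand, supply) shocks:
# demand raises both on impact, supply raises output and lowers inflation.
signs = np.array([[1, 1],
                  [1, -1]])
B = np.linalg.cholesky(np.array([[1.0, 0.3], [0.3, 1.0]]))  # toy impact matrix
accepted = sign_identified_impacts(B, signs)                # identified set
```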


We derive necessary and sufficient conditions under which a set of variables is informationally sufficient, i.e. it contains enough information to estimate the structural shocks with a VAR model. Based on such conditions, we suggest a procedure to test for informational sufficiency. Moreover, we show how to amend the VAR if informational sufficiency is rejected. We apply our procedure to a VAR including TFP, unemployment and per-capita hours worked. We find that the three variables are not informationally sufficient. When adding missing information, the effects of technology shocks change dramatically.
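
In spirit, the test asks whether information outside the VAR helps predict its variables. The sketch below is a simplified Granger-type F-test of whether lags of external factors F (for example, principal components of a large macro panel) improve on the VAR variables' own lags; it is a stand-in for, not a reproduction of, the paper's procedure.

```python
import numpy as np
from scipy import stats

def lag_matrix(X, p):
    """Row t holds X[t-1], ..., X[t-p], flattened (X is T x n)."""
    T = X.shape[0]
    return np.hstack([X[p - k:T - k] for k in range(1, p + 1)])

def sufficiency_ftest(Y, F, p=2):
    """For each VAR variable, F-test of the joint significance of the
    factor lags over and above the VAR's own lags; small p-values
    suggest the VAR variables are not informationally sufficient."""
    Yt = Y[p:]
    X_r = np.column_stack([np.ones(len(Yt)), lag_matrix(Y, p)])  # restricted
    X_u = np.column_stack([X_r, lag_matrix(F, p)])               # unrestricted
    q, dof = X_u.shape[1] - X_r.shape[1], len(Yt) - X_u.shape[1]
    pvals = []
    for y in Yt.T:
        rss = lambda X: np.sum((y - X @ np.linalg.lstsq(X, y, rcond=None)[0]) ** 2)
        f_stat = (rss(X_r) - rss(X_u)) / q / (rss(X_u) / dof)
        pvals.append(stats.f.sf(f_stat, q, dof))
    return pvals

# Placeholder data (not TFP, unemployment, and hours, and not real factors).
rng = np.random.default_rng(1)
Y = rng.standard_normal((200, 3))
F = rng.standard_normal((200, 2))
print(sufficiency_ftest(Y, F))   # with pure noise, p-values ~ uniform
```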


Coltop3D is a software package that performs structural analysis using digital elevation models (DEMs) and 3D point clouds acquired with terrestrial laser scanners. A color representation merging slope aspect and slope angle is used to obtain a unique color code for each orientation of a local slope, so that a continuous planar structure appears in a single color. Several tools are included to create stereonets, to draw traces of discontinuities, and to compute density stereonets automatically. Examples are shown to demonstrate the efficiency of the method.
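
The idea of a unique color per orientation can be sketched with a simple HSV mapping. This is an assumed Coltop3D-like scheme for illustration (aspect drives hue, slope angle drives saturation), not the software's exact color code.

```python
import numpy as np
from matplotlib.colors import hsv_to_rgb

def orientation_colors(aspect_deg, slope_deg):
    """Assumed Coltop3D-like mapping: slope aspect sets the hue and
    slope angle the saturation, so every local orientation receives a
    unique color and a continuous planar structure shows up in one
    color."""
    h = (np.asarray(aspect_deg, float) % 360.0) / 360.0
    s = np.clip(np.asarray(slope_deg, float) / 90.0, 0.0, 1.0)
    return hsv_to_rgb(np.stack([h, s, np.ones_like(h)], axis=-1))

# Example: color a 2 x 2 grid of orientations derived from a DEM.
aspect = np.array([[0.0, 90.0], [180.0, 270.0]])   # degrees from north
slope = np.array([[30.0, 60.0], [45.0, 90.0]])     # degrees from horizontal
rgb = orientation_colors(aspect, slope)            # shape (2, 2, 3)
```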