40 results for EXPLOITING MULTICOMMUTATION
Abstract:
Metabolic problems lead to numerous failures during clinical trials, and much effort is now devoted to developing in silico models predicting metabolic stability and metabolites. Such models are well known for cytochromes P450 and some transferases, whereas less has been done to predict the activity of human hydrolases. The present study was undertaken to develop a computational approach able to predict the hydrolysis of novel esters by human carboxylesterase hCES2. The study involved first a homology modeling of the hCES2 protein based on the model of hCES1, since the two proteins share a high degree of homology (≈73%). A set of 40 known substrates of hCES2 was taken from the literature; the ligands were docked in both their neutral and ionized forms using GriDock, a parallel tool based on the AutoDock 4.0 engine which can perform efficient and easy virtual screening analyses of large molecular databases exploiting multi-core architectures. Useful statistical models (e.g., r² = 0.91 for substrates in their unprotonated state) were calculated by correlating experimental pKm values with the distance between the carbon atom of the substrate's ester group and the hydroxy function of Ser228. Additional parameters in the equations accounted for hydrophobic and electrostatic interactions between substrates and contributing residues. The negatively charged residues in the hCES2 cavity explained the preference of the enzyme for neutral substrates and, more generally, suggested that ligands which interact too strongly by ionic bonds (e.g., ACE inhibitors) cannot be good CES2 substrates because they are trapped in the cavity in unproductive modes and behave as inhibitors. The effects of protonation on substrate recognition and the contrasting behavior of substrates and products were finally investigated by MD simulations of some CES2 complexes.
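The statistical model described above is essentially a linear correlation; a minimal sketch, using entirely hypothetical distance/pKm pairs (the published data set and coefficients are not reproduced here), illustrates the fit:

```python
import numpy as np

# Hypothetical docking results: distance (in Å) between the substrate's
# ester carbon and the Ser228 hydroxyl, paired with experimental pKm.
# These numbers are invented for illustration, not the published data.
dist = np.array([3.1, 3.4, 3.8, 4.2, 4.9, 5.5, 6.1, 7.0])
pKm  = np.array([5.2, 5.0, 4.7, 4.4, 4.0, 3.6, 3.3, 2.8])

# Ordinary least squares: pKm ≈ a·dist + b
A = np.column_stack([dist, np.ones_like(dist)])
(a, b), *_ = np.linalg.lstsq(A, pKm, rcond=None)

# Coefficient of determination of the fit
pred = a * dist + b
r2 = 1 - np.sum((pKm - pred) ** 2) / np.sum((pKm - pKm.mean()) ** 2)
print(f"pKm ≈ {a:.2f}·d + {b:.2f}, r² = {r2:.3f}")
```

With these invented values the slope is negative, i.e., substrates docked closer to the catalytic serine have higher affinity, which is the qualitative trend the abstract describes.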
Abstract:
Tourette syndrome is a childhood-onset neuropsychiatric disorder with a high prevalence of attention deficit hyperactivity disorder and obsessive-compulsive disorder co-morbidities. Structural changes have been found in frontal cortex and striatum in children and adolescents. A limited number of morphometric studies in Tourette syndrome persisting into adulthood suggest ongoing structural alterations affecting frontostriatal circuits. Using cortical thickness estimation and voxel-based analysis of T1- and diffusion-weighted structural magnetic resonance images, we examined 40 adults with Tourette syndrome in comparison with 40 age- and gender-matched healthy controls. Patients with Tourette syndrome showed relative grey matter volume reduction in orbitofrontal, anterior cingulate and ventrolateral prefrontal cortices bilaterally. Cortical thinning extended into the limbic mesial temporal lobe. The grey matter changes were modulated additionally by the presence of co-morbidities and symptom severity. Prefrontal cortical thickness reduction correlated negatively with tic severity, while volume increase in primary somatosensory cortex depended on the intensity of premonitory sensations. Orbitofrontal cortex volume changes were further associated with abnormal water diffusivity within grey matter. White matter analysis revealed changes in fibre coherence in patients with Tourette syndrome within anterior parts of the corpus callosum. The severity of motor tics and premonitory urges had an impact on the integrity of tracts corresponding to cortico-cortical and cortico-subcortical connections. Our results provide empirical support for a patho-aetiological model of Tourette syndrome based on developmental abnormalities, with perturbation of compensatory systems marking persistence of symptoms into adulthood. We interpret the symptom-severity-related grey matter volume increase in distinct functional brain areas as evidence of ongoing structural plasticity.
The convergence of evidence from volume and water diffusivity imaging strengthens the validity of our findings and attests to the value of a novel multimodal combination of volume and cortical thickness estimations that provides unique and complementary information by exploiting their differential sensitivity to structural change.
Abstract:
In this paper we present the procedure we followed to develop the Italian SuperSense Tagger. In particular, we adapted the English SuperSense Tagger to the Italian language by exploiting a parallel sense-labeled corpus for training. As in the English case, the Italian tagger uses a fixed set of 26 semantic labels, called supersenses; it achieves a slightly lower accuracy due to the lower quality of the Italian training data. Both taggers accomplish the same task of identifying entities and concepts belonging to a common set of ontological types. This parallelism allows us to define effective methodologies for a broad range of cross-language knowledge acquisition tasks.
Abstract:
Since its creation, the Internet has permeated our daily life. The web is omnipresent for communication, research and organization. This widespread use has resulted in the rapid development of the Internet. Nowadays, the Internet is the biggest container of resources. Information databases such as Wikipedia, Dmoz and the open data available on the net are a great informational resource for mankind. Easy and free web access is one of the major features characterizing the Internet culture. Ten years ago, the web was completely dominated by English. Today, the web community is no longer only English speaking but is becoming a genuinely multilingual community. The availability of content is intertwined with the availability of logical organizations (ontologies), for which multilinguality plays a fundamental role. In this work we introduce a very high-level logical organization fully based on semiotic assumptions. We thus present the theoretical foundations as well as the ontology itself, named Linguistic Meta-Model. The most important feature of the Linguistic Meta-Model is its ability to support the representation of different knowledge sources developed according to different underlying semiotic theories. This is possible because most knowledge representation schemata, either formal or informal, can be put into the context of the so-called semiotic triangle. In order to show the main characteristics of the Linguistic Meta-Model from a practical point of view, we developed VIKI (Virtual Intelligence for Knowledge Induction). VIKI is a work-in-progress system aiming at exploiting the Linguistic Meta-Model structure for knowledge expansion. It is a modular system in which each module accomplishes a natural language processing task, from terminology extraction to knowledge retrieval. VIKI is a supporting system for the Linguistic Meta-Model, and its main task is to give some empirical evidence regarding the use of the Linguistic Meta-Model, without claiming to be exhaustive.
Abstract:
In this paper we study the relevance of multiple kernel learning (MKL) for the automatic selection of time series inputs. Recently, MKL has gained great attention in the machine learning community due to its flexibility in modelling complex patterns and performing feature selection. In general, MKL constructs the kernel as a weighted linear combination of basis kernels, exploiting different sources of information. An efficient algorithm wrapping a Support Vector Regression model for optimizing the MKL weights, named SimpleMKL, is used for the analysis. In this sense, MKL performs feature selection by discarding inputs/kernels with low or null weights. The proposed approach is tested with simulated linear and nonlinear time series (AutoRegressive, Henon and Lorenz series).
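A minimal sketch of the weighted-kernel idea, assuming a toy AR(2) series and using centred kernel-target alignment as a crude stand-in for SimpleMKL's gradient-based weight optimization (the lags, kernel width and weighting heuristic are illustrative assumptions, not the paper's method):

```python
import numpy as np

def rbf_kernel(x, gamma=1.0):
    # RBF kernel matrix for a single 1-D input (one candidate lag)
    d2 = (x[:, None] - x[None, :]) ** 2
    return np.exp(-gamma * d2)

# Toy AR(2) series: only lags 1 and 2 carry real information
rng = np.random.default_rng(0)
y = np.zeros(300)
for t in range(2, len(y)):
    y[t] = 0.6 * y[t - 1] - 0.3 * y[t - 2] + rng.standard_normal()

lags = [1, 2, 3, 4]                       # candidate inputs, one kernel each
p = max(lags)
cols = [y[p - l:len(y) - l] for l in lags]
target = y[p:]

Ks = [rbf_kernel(c) for c in cols]        # basis kernels

# Centred kernel-target alignment: a cheap heuristic standing in for
# SimpleMKL's weight optimization
yc = target - target.mean()
yy = np.outer(yc, yc)
align = np.array([np.sum(K * yy) / (np.linalg.norm(K) * np.linalg.norm(yy))
                  for K in Ks])
w = np.clip(align, 0.0, None)
w /= w.sum()                              # weights on the simplex

K = sum(wi * Ki for wi, Ki in zip(w, Ks)) # weighted linear combination
alpha = np.linalg.solve(K + 1e-2 * np.eye(len(target)), target)
print("kernel weight per lag:", dict(zip(lags, np.round(w, 3))))
```

Inputs whose kernels receive near-zero weight are effectively discarded, which is the feature-selection behaviour the abstract describes.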
Abstract:
OBJECT: To study a scan protocol for coronary magnetic resonance angiography based on multiple breath-holds featuring 1D motion compensation and to compare the resulting image quality to a navigator-gated free-breathing acquisition. Image reconstruction was performed using L1-regularized iterative SENSE. MATERIALS AND METHODS: The effects of respiratory motion on the Cartesian sampling scheme were minimized by performing data acquisition in multiple breath-holds. During the scan, repetitive readouts through the k-space center were used to detect and correct the respiratory displacement of the heart by exploiting the self-navigation principle in image reconstruction. In vivo experiments were performed in nine healthy volunteers and the resulting image quality was compared to a navigator-gated reference in terms of vessel length and sharpness. RESULTS: Acquisition in breath-holds is an effective method to reduce the scan time by more than 30% compared to the navigator-gated reference. Although an equivalent mean image quality with respect to the reference was achieved with the proposed method, the 1D motion compensation did not work equally well in all cases. CONCLUSION: In general, the image quality scaled with the robustness of the motion compensation. Nevertheless, the featured setup provides a positive basis for future extension with more advanced motion compensation methods.
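L1-regularized reconstructions of this kind are typically solved with iterative shrinkage; a generic sketch (plain ISTA on a toy sparse-recovery problem, with the SENSE coil-sensitivity model omitted and all sizes hypothetical) shows the core update:

```python
import numpy as np

def ista(A, y, lam=0.01, iters=1000):
    # Iterative soft-thresholding for min_x 0.5·||Ax - y||² + lam·||x||₁,
    # the generic core of an L1-regularized reconstruction; the actual
    # SENSE encoding operator is omitted in this sketch.
    step = 1.0 / np.linalg.norm(A, 2) ** 2      # 1 / Lipschitz constant
    x = np.zeros(A.shape[1])
    for _ in range(iters):
        z = x - step * A.T @ (A @ x - y)        # gradient step on data term
        x = np.sign(z) * np.maximum(np.abs(z) - step * lam, 0.0)  # shrink
    return x

# Toy sparse-recovery problem with invented sizes
rng = np.random.default_rng(0)
x_true = np.zeros(100)
x_true[[5, 40, 77]] = [2.0, -1.5, 1.0]
A = rng.standard_normal((60, 100)) / np.sqrt(60)
y = A @ x_true
x_hat = ista(A, y)
```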
Abstract:
We provide here a detailed protocol for studying the changes in electrical surface potential of leaves. This method has been developed over the years by plant physiologists and is currently used in different variants in many laboratories. The protocol records surface potential changes to measure long-distance electrical signals induced by diverse stimuli such as leaf wounding or current injection. This technique can be used to determine signaling speeds, to measure the connectivity between different plant organs and, by exploiting mutant plants, to identify transporters and ion channels involved in electrical signaling. The approach can be combined with the analysis of mRNA expression and of metabolite concentrations to correlate electrical signaling with specific physiological events. We describe how to use this protocol on Arabidopsis, looking at the effects of leaf wounding; however, it is broadly applicable to other plants and can be used to study other aspects of plant physiology. After wound infliction, surface potential recording takes ∼20 min per plant.
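Signaling speed follows directly from the electrode distance and the onset delay of the surface-potential change; a trivial sketch with hypothetical electrode positions and onset times (all numbers invented):

```python
# Hypothetical electrode distances from the wound site and onset delays
# of the surface-potential change (numbers invented for illustration)
electrode_cm = {"leaf_8": 2.0, "leaf_13": 5.5}    # distance (cm)
onset_s = {"leaf_8": 40.0, "leaf_13": 110.0}      # onset after wounding (s)

for leaf, d in electrode_cm.items():
    speed = d / onset_s[leaf]                     # signaling speed (cm/s)
    print(f"{leaf}: {speed * 60:.1f} cm/min")
```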
Abstract:
Quantification of short-echo-time proton magnetic resonance spectroscopy yields >18 metabolite concentrations (the neurochemical profile). Their quantification accuracy depends on the assessment of the contribution of macromolecule (MM) resonances, previously achieved experimentally by exploiting the severalfold difference in T1. To minimize the effects of heterogeneities in metabolite T1, the aim of the study was to assess MM signal contributions by combining inversion recovery (IR) and diffusion-weighted proton spectroscopy at high magnetic field (14.1 T) and short echo time (8 msec) in the rat brain. IR combined with diffusion-weighting experiments (with δ/Δ = 1.5/200 msec and b-value = 11.8 msec/μm²) showed that the metabolite-nulled spectrum (inversion time = 740 msec) was affected by residuals attributed to creatine, inositol, taurine, choline, N-acetylaspartate as well as glutamine and glutamate. While the metabolite residuals were significantly attenuated (by 50%), the MM signals were largely unaffected (<8%). The combination of metabolite-nulled IR spectra with diffusion weighting allows a specific characterization of MM resonances with minimal metabolite signal contributions and is expected to lead to a more precise quantification of the neurochemical profile.
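The metabolite-nulling logic rests on standard inversion-recovery relaxation; a small sketch (with placeholder T1 values, not those measured in the study) shows why species whose T1 deviates from the nominal value leave residuals in a "nulled" spectrum:

```python
import math

def mz_after_inversion(t, T1):
    # Longitudinal magnetization after a 180° pulse (fully relaxed before):
    # Mz(t)/M0 = 1 - 2·exp(-t/T1)
    return 1.0 - 2.0 * math.exp(-t / T1)

def null_time(T1):
    # Inversion time that nulls a species with relaxation time T1
    return T1 * math.log(2.0)

# Placeholder T1 values (msec) — illustrative only, not measured values;
# the nominal metabolite T1 is chosen so that TI lands near 740 msec.
TI = null_time(1070.0)
for name, T1 in {"creatine": 1300.0, "NAA": 1700.0}.items():
    # Species whose T1 differs from the nominal value are not nulled,
    # leaving the residual metabolite signals described in the abstract.
    print(f"{name}: residual Mz/M0 = {mz_after_inversion(TI, T1):+.2f}")
```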
Abstract:
Tractography algorithms provide us with the ability to non-invasively reconstruct fiber pathways in the white matter (WM) by exploiting the directional information described with diffusion magnetic resonance. These methods can be divided into two major classes, local and global. Local methods reconstruct each fiber tract iteratively by considering only directional information at the voxel level and its neighborhood. Global methods, on the other hand, reconstruct all the fiber tracts of the whole brain simultaneously by solving a global energy minimization problem. The latter have shown improvements compared to previous techniques, but these algorithms still suffer from an important shortcoming that is crucial in the context of brain connectivity analyses. As no anatomical priors are usually considered during the reconstruction process, the recovered fiber tracts are not guaranteed to connect cortical regions and, as a matter of fact, most of them stop prematurely in the WM; this violates important properties of neural connections, which are known to originate in the gray matter (GM) and develop in the WM. Hence, this shortcoming poses serious limitations for the use of these techniques for the assessment of the structural connectivity between brain regions and, de facto, it can potentially bias any subsequent analysis. Moreover, the estimated tracts are not quantitative: every fiber contributes with the same weight toward the predicted diffusion signal. In this work, we propose a novel approach for global tractography that is specifically designed for connectivity analysis applications which: (i) explicitly enforces anatomical priors of the tracts in the optimization and (ii) considers the effective contribution of each of them, i.e., volume, to the acquired diffusion magnetic resonance imaging (MRI) image. We evaluated our approach on both a realistic diffusion MRI phantom and in vivo data, and also compared its performance to existing tractography algorithms.
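The per-fiber contribution idea amounts to a nonnegative least-squares fit of candidate-tract signals to the measured data; a toy sketch with a synthetic forward model (the matrix and weights are invented for illustration, and the real method's anatomical priors are not modeled):

```python
import numpy as np

# Synthetic forward model: each candidate tract contributes a fixed
# signal profile (columns of Phi); the per-tract weights (volumes) are
# recovered by nonnegative least squares. All numbers are invented.
rng = np.random.default_rng(3)
n_meas, n_tracts = 50, 8
Phi = np.abs(rng.standard_normal((n_meas, n_tracts)))
w_true = np.array([1.0, 0.0, 0.5, 0.0, 2.0, 0.0, 0.0, 0.3])
signal = Phi @ w_true

# Projected-gradient NNLS: the constraint w >= 0 is enforced each step,
# so spurious candidate tracts are driven to zero weight.
w = np.zeros(n_tracts)
step = 1.0 / np.linalg.norm(Phi, 2) ** 2
for _ in range(5000):
    w = np.maximum(w - step * Phi.T @ (Phi @ w - signal), 0.0)

print("recovered tract weights:", np.round(w, 2))
```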
Abstract:
BACKGROUND: New evidence shows that high-density lipoproteins (HDL) have protective effects beyond their role in reverse cholesterol transport. Reconstituted HDL (rHDL) offer an attractive means of clinically exploiting these novel effects, including cardioprotection against ischemia-reperfusion injury (IRI). However, basic rHDL composition is limited to apolipoprotein AI (apoAI) and phospholipids; addition of a bioactive compound may enhance its beneficial effects. OBJECTIVE: The aim of this study was to investigate the role of rHDL in a post-ischemic model, and to analyze the potential impact of sphingosine-1-phosphate (S1P) in rHDL formulations. METHODS AND RESULTS: The impact of HDL on IRI was investigated using complementary in vivo, ex vivo and in vitro IRI models. Acute post-ischemic treatment with native HDL significantly reduced infarct size and cell death in the ex vivo, isolated heart (Langendorff) model and the in vivo model (-48%, p<0.01). Treatment with rHDL of basic formulation (apoAI + phospholipids) had a non-significant impact on cell death in vitro and on the infarct size ex vivo and in vivo. In contrast, rHDL containing S1P had a highly significant, protective influence ex vivo and in vivo (-50%, p<0.01). This impact was comparable with the effects observed with native HDL. Pro-survival signaling proteins Akt, STAT3 and ERK1/2 were similarly activated by HDL and rHDL containing S1P both in vitro (isolated cardiomyocytes) and in vivo. CONCLUSION: HDL afford protection against IRI in a clinically relevant model (post-ischemia). rHDL is significantly protective if supplemented with S1P. The protective impact of HDL appears to directly target the cardiomyocyte.
Abstract:
Arbuscular mycorrhizal fungi (AMF) are obligate symbionts with most terrestrial plants. They improve plant nutrition, particularly phosphate acquisition, and thus are able to improve plant growth. In exchange, the fungi obtain photosynthetically fixed carbon. AMF are coenocytic, meaning that many nuclei coexist in a common cytoplasm. Genetic exchange recently has been demonstrated in the AMF Glomus intraradices, allowing nuclei of different Glomus intraradices strains to mix. Such genetic exchange was shown previously to have negative effects on plant growth and to alter fungal colonization. However, no attempt was made to detect whether genetic exchange in AMF can alter plant gene expression and if this effect was time dependent. Here, we show that genetic exchange in AMF also can be beneficial for rice growth, and that symbiosis-specific gene transcription is altered by genetic exchange. Moreover, our results show that genetic exchange can change the dynamics of the colonization of the fungus in the plant. Our results demonstrate that the simple manipulation of the genetics of AMF can have important consequences for their symbiotic effects on plants such as rice, which is considered the most important crop in the world. Exploiting natural AMF genetic variation by generating novel AMF genotypes through genetic exchange is a potentially useful tool in the development of AMF inocula that are more beneficial for crop growth.
Abstract:
The stable co-existence of two haploid genotypes or two species is studied in a spatially heterogeneous environment submitted to a mixture of soft selection (within-patch regulation) and hard selection (outside-patch regulation) and where two kinds of resource are available. This is analysed both at an ecological time-scale (short term) and at an evolutionary time-scale (long term). At an ecological scale, we show that co-existence is very unlikely if the two competitors are symmetrical specialists exploiting different resources. In this case, the most favourable conditions are met when the two resources are equally available, a situation that should favour generalists at an evolutionary scale. Alternatively, low within-patch density dependence (soft selection) enhances the co-existence between two slightly different specialists of the most available resource. This results from the opposing forces that are acting in hard and soft regulation modes. In the case of unbalanced accessibility to the two resources, hard selection favours the most specialized genotype, whereas soft selection strongly favours the less specialized one. Our results suggest that competition for different resources may be difficult to demonstrate in the wild even when it is a key factor in the maintenance of adaptive diversity. At an evolutionary scale, a monomorphic invasive evolutionarily stable strategy (ESS) always exists. When a linear trade-off exists between survival in one habitat versus that in another, this ESS lies between an absolute adjustment of survival to niche size (for mainly soft-regulated populations) and absolute survival (specialization) in a single niche (for mainly hard-regulated populations). This suggests that environments in agreement with the assumptions of such models should lead to an absence of adaptive variation in the long term.
Abstract:
The evolution of eusociality, here defined as the emergence of societies with reproductive division of labour and cooperative brood care, was first seen as a challenge to Darwin's theory of evolution by natural selection. Why should individuals permanently forgo direct reproduction to help other individuals to reproduce? Kin selection, the indirect transmission of genes through relatives, is the key process explaining the evolution of permanently nonreproductive helpers. However, in some taxa helpers delay reproduction until a breeding opportunity becomes available. Overall, eusociality evolved when ecological conditions promote stable associations of related individuals that benefit from jointly exploiting and defending common resources. High levels of cooperation and robust mechanisms of division of labour are found in many animal societies. However, conflicts among individuals are still frequent when group members that are not genetically identical compete over reproduction or resource allocation.
Abstract:
Soil pollution with hexachlorocyclohexane (HCH) has caused serious environmental problems. Here we describe the targeted degradation of all HCH isomers by applying the aerobic bacterium Sphingobium indicum B90A. In particular, we examined possibilities for large-scale cultivation of strain B90A, tested immobilization, storage and inoculation procedures, and determined the survival and HCH-degradation activity of inoculated cells in soil. Optimal growth of strain B90A was achieved in glucose-containing mineral medium, and up to 65% culturability could be maintained after 60 days of storage at 30 °C by mixing cells with sterile dry corncob powder. B90A biomass produced in water supplemented with sugarcane molasses and immobilized on corncob powder retained 15-20% culturability after 30 days of storage at 30 °C, whereas full culturability was maintained when cells were stored frozen at -20 °C. On the contrary, cells stored on corncob degraded gamma-HCH faster than those that had been stored frozen, with between 15 and 85% gamma-HCH disappearance in microcosms within 20 h at 30 °C. Soil microcosm tests at 25 °C confirmed complete mineralization of [14C]-gamma-HCH by corncob-immobilized strain B90A. Experiments conducted in small pits and at an HCH-contaminated agricultural site resulted in between 85 and 95% HCH degradation by strain B90A applied via corncob, depending on the type of HCH isomer and even at residual HCH concentrations. Up to 20% of the inoculated B90A cells survived under field conditions after 8 days and could be traced among other soil microorganisms by a combination of natural antibiotic resistance properties, unique pigmentation and PCR amplification of the linA genes. Neither the addition of corncob nor that of corncob-immobilized B90A measurably changed the microbial community structure as determined by T-RFLP analysis.
Overall, these results indicate that on-site aerobic bioremediation of HCH exploiting the biodegradation activity of S. indicum B90A cells stored on corncob powder is a promising technology.
Abstract:
Due to the advances in sensor networks and remote sensing technologies, the acquisition and storage rates of meteorological and climatological data increase every day and call for novel and efficient processing algorithms. A fundamental problem of data analysis and modeling is the spatial prediction of meteorological variables in complex orography, which serves, among other purposes, extended climatological analyses, the assimilation of data into numerical weather prediction models, the preparation of inputs to hydrological models, and the real-time monitoring and short-term forecasting of weather.

In this thesis, a new framework for spatial estimation is proposed by taking advantage of a class of algorithms emerging from statistical learning theory. Nonparametric kernel-based methods for nonlinear data classification, regression and target detection, known as support vector machines (SVM), are adapted for the mapping of meteorological variables in complex orography.

With the advent of high-resolution digital elevation models, the field of spatial prediction has met new horizons. In fact, by exploiting image processing tools along with physical heuristics, a large number of terrain features which account for the topographic conditions at multiple spatial scales can be extracted. Such features are highly relevant for the mapping of meteorological variables because they control a considerable part of the spatial variability of meteorological fields in the complex Alpine orography. For instance, patterns of orographic rainfall, wind speed and cold air pools are known to be correlated with particular terrain forms, e.g. convex/concave surfaces and upwind sides of mountain slopes.

Kernel-based methods are employed to learn the nonlinear statistical dependence which links the multidimensional space of geographical and topographic explanatory variables to the variable of interest, that is, the wind speed as measured at the weather stations or the occurrence of orographic rainfall patterns as extracted from sequences of radar images. Compared to low-dimensional models integrating only the geographical coordinates, the proposed framework opens a way to regionalize meteorological variables which are multidimensional in nature and rarely show spatial auto-correlation in the original space, making the use of classical geostatistics cumbersome.

The challenges explored during the thesis are manifold. First, the complexity of the models is optimized to impose appropriate smoothness properties and reduce the impact of noisy measurements. Secondly, a multiple kernel extension of SVM is considered to select the multiscale features which explain most of the spatial variability of wind speed. Then, SVM target detection methods are implemented to describe the orographic conditions which cause persistent and stationary rainfall patterns. Finally, the optimal splitting of the data is studied to estimate realistic performances and confidence intervals characterizing the uncertainty of predictions.

The resulting maps of average wind speeds find applications within renewable resource assessment and open a route to decreasing the temporal scale of analysis to meet hydrological requirements. Furthermore, the maps depicting the susceptibility to orographic rainfall enhancement can be used to improve current radar-based quantitative precipitation estimation and forecasting systems and to generate stochastic ensembles of precipitation fields conditioned upon the orography.
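A minimal sketch of kernel-based spatial prediction from terrain features, using kernel ridge regression as a simple stand-in for SVR; the station data (coordinates, elevation, slope and wind speeds) are fully synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic "stations": planar coordinates plus DEM-style terrain
# descriptors (elevation, slope); all values are invented.
n = 200
X = rng.uniform(0.0, 1.0, size=(n, 4))            # x, y, elevation, slope
wind = (3.0 + 4.0 * X[:, 2] ** 2                  # nonlinear terrain effect
        + 2.0 * np.sin(6.0 * X[:, 3])
        + 0.2 * rng.standard_normal(n))           # measurement noise

def rbf(A, B, gamma=2.0):
    # RBF kernel between two sets of feature vectors
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-gamma * d2)

# Kernel ridge regression in the 4-D feature space (stand-in for SVR)
K = rbf(X, X)
alpha = np.linalg.solve(K + 1e-2 * np.eye(n), wind)

# Predict at a new grid point described by the same features
x_new = np.array([[0.5, 0.5, 0.8, 0.3]])
pred = rbf(x_new, X) @ alpha
print(f"predicted wind speed: {pred[0]:.2f} m/s")
```

Because the kernel operates on the full feature vector, the model captures dependence on elevation and slope that a purely coordinate-based (geostatistical) model would miss, which is the point the thesis makes about high-dimensional explanatory variables.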