951 results for vector quantization based Gaussian modeling
Abstract:
The transparency document related to this article can be found online at http://dx.doi.org/10.1016/j.bbrc.2015.10.014
Abstract:
One of the largest resources for biological sequence data is the wealth of expressed sequence tags (ESTs) available in public and proprietary databases. ESTs provide information on transcripts, but for technical reasons they often contain sequencing errors, which must be taken into account when EST sequences are analyzed computationally. Earlier approaches to modeling error-prone coding regions have shown good performance in detecting and predicting such regions while correcting sequencing errors using codon usage frequencies. In the research presented here, we improve the detection of translation start and stop sites by integrating a more complex mRNA model with codon-usage-bias-based error correction into one hidden Markov model (HMM), thus generalizing this error-correction approach to more complex HMMs. We show that our method maintains its performance in detecting coding sequences.
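The decoding machinery behind such a model can be illustrated with a toy example. Below is a minimal, illustrative Viterbi decoder for a two-state (coding vs. non-coding) HMM emitting codons, where the coding state uses an assumed codon-usage table; the model described in the abstract is considerably richer (start/stop sites, error states), so every probability and codon here is a placeholder.

```python
import numpy as np

# Toy two-state HMM (coding vs. non-coding) over codons; a minimal sketch,
# not the paper's full mRNA / error-correction model.
states = ["coding", "noncoding"]
log_trans = np.log(np.array([[0.95, 0.05],
                             [0.10, 0.90]]))
log_start = np.log(np.array([0.5, 0.5]))

# Assumed codon-usage table for the coding state (illustrative values only);
# the non-coding state emits codons uniformly.
codon_usage = {"ATG": 0.4, "GAA": 0.3, "TTT": 0.2, "TAA": 0.1}
codons = list(codon_usage)
log_emit = np.log(np.array([
    [codon_usage[c] for c in codons],      # coding state
    [1.0 / len(codons)] * len(codons),     # non-coding state
]))

def viterbi(obs):
    """Most likely state path for a list of observed codons."""
    idx = [codons.index(c) for c in obs]
    V = log_start + log_emit[:, idx[0]]
    back = []
    for o in idx[1:]:
        scores = V[:, None] + log_trans    # scores[i, j]: transition i -> j
        back.append(scores.argmax(axis=0))
        V = scores.max(axis=0) + log_emit[:, o]
    path = [int(V.argmax())]
    for b in reversed(back):
        path.append(int(b[path[-1]]))
    return [states[s] for s in reversed(path)]

print(viterbi(["ATG", "GAA", "TTT", "TAA"]))
```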
Abstract:
Introduction: Non-invasive brain imaging techniques often contrast experimental conditions across a cohort of participants, obscuring distinctions in individual performance and brain mechanisms that are better characterised by the inter-trial variability. To overcome such limitations, we developed topographic analysis methods for single-trial EEG data [1]. So far, single-trial analysis has typically been based on time-frequency analysis of single-electrode data or single independent components. The method's efficacy is demonstrated for event-related responses to environmental sounds, hitherto studied at an average event-related potential (ERP) level. Methods: Nine healthy subjects participated in the experiment. Auditory meaningful sounds of common objects were used for a target detection task [2]. In each block, subjects were asked to discriminate target sounds, which were living or man-made auditory objects. Continuous 64-channel EEG was acquired during the task. Two datasets were considered for each subject, comprising single trials of the two conditions, living and man-made. The analysis comprised two steps. In the first step, a mixture of Gaussians analysis [3] provided representative topographies for each subject. In the second step, conditional probabilities for each Gaussian provided statistical inference on the structure of these topographies across trials, time, and experimental conditions. A similar analysis was conducted at the group level. Results: The occurrence of each map is structured in time and consistent across trials at both the single-subject and group levels. By conducting separate analyses of ERPs at the single-subject and group levels, we could quantify the consistency of the identified topographies and their time course of activation within and across participants as well as experimental conditions. A general agreement was found with previous analyses at the average ERP level. Conclusions: This novel approach to single-trial analysis promises to have impact in several domains. In clinical research, it makes it possible to statistically evaluate single-subject data, an essential tool for analysing patients with specific deficits and impairments and their deviation from normative standards. In cognitive neuroscience, it provides a novel tool for understanding behaviour and brain activity interdependencies at both the single-subject and group levels. In basic neurophysiology, it provides a new representation of ERPs and promises to cast light on the mechanisms of their generation and inter-individual variability.
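The core of the first step can be sketched with an off-the-shelf Gaussian mixture model. The snippet below is a rough illustration, not the authors' exact pipeline [3]: the array shapes, the number of components, and the covariance structure are assumptions, and random numbers stand in for real EEG.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

# Illustrative shapes (assumptions): n_trials single trials, n_times samples per
# trial, n_channels = 64 electrodes. Each time point of each trial is a topography.
n_trials, n_times, n_channels = 100, 200, 64
eeg = np.random.randn(n_trials, n_times, n_channels)   # placeholder for real data

# Stack all single-trial topographies into a matrix of shape (observations, channels).
topographies = eeg.reshape(-1, n_channels)

# Fit a mixture of Gaussians; each component mean acts as a representative topography.
gmm = GaussianMixture(n_components=5, covariance_type="diag", random_state=0)
gmm.fit(topographies)

# Conditional (posterior) probabilities of each component at every trial/time point,
# which can then be analysed across trials, time, and experimental conditions.
post = gmm.predict_proba(topographies).reshape(n_trials, n_times, -1)
template_maps = gmm.means_    # shape (5, 64): representative topographies
```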
Abstract:
The Mont Collon mafic complex is one of the best preserved examples of the Early Permian magmatism in the Central Alps, related to the intra-continental collapse of the Variscan belt. It mostly consists (> 95 vol.%) of ol+hy-normative plagioclase-wehrlites, olivine- and cpx-gabbros with cumulitic structures, crosscut by acid dikes. Pegmatitic gabbros, troctolites and anorthosites outcrop locally. A well-preserved cumulative sequence is exposed in the Dents de Bertol area (center of the intrusion). PT calculations indicate that this layered magma chamber was emplaced at mid-crustal levels at about 0.5 GPa and 1100 degrees C. The Mont Collon cumulitic rocks record little magmatic differentiation, as illustrated by the restricted range of clinopyroxene mg-number (Mg#(cpx) = 83-89). Whole-rock incompatible trace-element contents (e.g. Nb, Zr, Ba) vary widely and without correlation with major-element composition. These features are characteristic of an in-situ crystallization process with variable amounts of interstitial liquid L trapped between the cumulus mineral phases. LA-ICPMS measurements show that the trace-element distribution in the latter is homogeneous, pointing to subsolidus re-equilibration between crystals and interstitial melts. Quantitative modeling based on Langmuir's in-situ crystallization equation successfully reproduced the REE concentrations in cumulitic minerals of all rock facies of the intrusion. The calculated amounts of interstitial liquid L vary between 0 and 35% for degrees of differentiation F of 0 to 20%, relative to the least evolved facies of the intrusion. L values are well correlated with the modal proportions of interstitial amphibole and with whole-rock incompatible trace-element concentrations (e.g. Zr, Nb) of the tested samples. However, the in-situ crystallization model reaches its limitations with rocks containing a high modal content of REE-bearing minerals (i.e. zircon), such as pegmatitic gabbros. Dikes of anorthositic composition, locally crosscutting the layered lithologies, provide evidence that the Mont Collon rocks evolved in an open system with mixing of intercumulus liquids of different origins and possibly contrasting compositions. The proposed model cannot resolve these complex open systems, but migrating liquids could be partly responsible for the observed dispersion of points in some correlation diagrams. The absence of significant differentiation with recurrent lithologies in the cumulitic pile of Dents de Bertol points to an efficiently convective magma chamber, with possible periodic replenishment. (c) 2005 Elsevier B.V. All rights reserved.
Abstract:
CD8 T cells play a key role in mediating protective immunity against selected pathogens after vaccination. Understanding the mechanism of this protection depends on defining the heterogeneity and complexity of cellular immune responses generated by different vaccines. Here, we identify previously unrecognized subsets of CD8 T cells based upon analysis of gene-expression patterns within single cells and show that they are differentially induced by different vaccines. Three prime-boost vector combinations encoding HIV Env stimulated antigen-specific CD8 T-cell populations of similar magnitude, phenotype, and functionality. Remarkably, however, analysis of single-cell gene-expression profiles enabled discrimination of a majority of central memory (CM) and effector memory (EM) CD8 T cells elicited by the three vaccines. Subsets of T cells could be defined based on their expression of Eomes, Cxcr3, and Ccr7, or Klrk1, Klrg1, and Ccr5 in CM and EM cells, respectively. Of CM cells elicited by DNA prime-recombinant adenoviral (rAd) boost vectors, 67% were Eomes(-) Ccr7(+) Cxcr3(-), in contrast to only 7% and 2% stimulated by rAd5-rAd5 or rAd5-LCMV, respectively. Of EM cells elicited by DNA-rAd, 74% were Klrk1(-) Klrg1(-) Ccr5(-), compared with only 26% and 20% for rAd5-rAd5 or rAd5-LCMV. Definition by single-cell gene profiling of specific CM and EM CD8 T-cell subsets that are differentially induced by different gene-based vaccines will facilitate the design and evaluation of vaccines and advance our understanding of mechanisms of protective immunity.
Abstract:
The algorithmic approach to data modelling has developed rapidly in recent years; in particular, methods based on data mining and machine learning have been used in a growing number of applications. These methods follow a data-driven methodology, aiming at providing the best possible generalization and predictive ability instead of concentrating on the properties of the data model. One of the most successful groups of such methods is known as Support Vector algorithms. Following the fruitful developments in applying Support Vector algorithms to spatial data, this paper introduces a new extension of the traditional support vector regression (SVR) algorithm. This extension allows the simultaneous modelling of environmental data at several spatial scales. The joint influence of environmental processes presenting different patterns at different scales is learned automatically from data, providing the optimum mixture of short- and large-scale models. The method is adaptive to the spatial scale of the data. With this advantage, it can provide efficient means to model local anomalies that typically arise at an early phase of an environmental emergency. However, the proposed approach still requires some prior knowledge of the possible existence of such short-scale patterns; this is a possible limitation of the method for its implementation in early warning systems. The purpose of this paper is to present the multi-scale SVR model and to illustrate its use with an application to the mapping of Cs-137 activity from measurements taken in the region of Briansk following the Chernobyl accident.
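One way to realize such a multi-scale SVR, sketched here as an assumption about the general idea rather than the paper's exact formulation, is to feed a weighted sum of a short-range and a long-range RBF kernel to a standard SVR; the weight and the two length scales below are placeholders that would normally be tuned from data, e.g. by cross-validation.

```python
import numpy as np
from sklearn.svm import SVR

def rbf(X, Y, length_scale):
    """Gaussian RBF kernel matrix between two sets of points."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2.0 * length_scale ** 2))

# Mixture of a short-scale and a large-scale kernel; w and the length scales
# are illustrative assumptions, not values from the paper.
def multi_scale_kernel(X, Y, w=0.5, short=0.1, large=2.0):
    return w * rbf(X, Y, short) + (1.0 - w) * rbf(X, Y, large)

# Toy 2-D spatial data: a smooth large-scale trend plus one local anomaly.
rng = np.random.default_rng(0)
X = rng.uniform(0, 5, size=(300, 2))
y = (np.sin(X[:, 0])
     + 2.0 * np.exp(-((X - 2.5) ** 2).sum(1) / 0.05)
     + 0.05 * rng.standard_normal(300))

model = SVR(kernel=multi_scale_kernel, C=10.0, epsilon=0.01)
model.fit(X, y)
y_hat = model.predict(X)   # captures both the trend and the local anomaly
```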
Abstract:
During the past few decades, numerous plasmid vectors have been developed for cloning, gene expression analysis, and genetic engineering. Cloning procedures typically rely on PCR amplification, DNA fragment restriction digestion, recovery, and ligation, but increasingly, procedures are being developed to assemble large synthetic DNAs. In this study, we developed a new gene delivery system using the integrase activity of an integrative and conjugative element (ICE). The advantage of the integrase-based delivery is that it can stably introduce a large DNA fragment (at least 75 kb) into one or more specific sites (the gene for glycine-accepting tRNA) on a target chromosome. Integrase recombination activity in Escherichia coli is kept low by using a synthetic hybrid promoter, which, however, is unleashed in the final target host, forcing the integration of the construct. Upon integration, the system is again silenced. Two variants with different genetic features were produced, one in the form of a cloning vector in E. coli and the other as a mini-transposable element by which large DNA constructs assembled in E. coli can be tagged with the integrase gene. We confirmed that the system could successfully introduce cosmid and bacterial artificial chromosome (BAC) DNAs from E. coli into the chromosome of Pseudomonas putida in a site-specific manner. The integrase delivery system works in concert with existing vector systems and could thus be a powerful tool for synthetic constructions of new metabolic pathways in a variety of host bacteria.
Abstract:
The goal of the present work was to assess the feasibility of using a pseudo-inverse and null-space optimization approach in the modeling of shoulder biomechanics. The method was applied to a simplified musculoskeletal shoulder model. The mechanical system consisted of the arm, and the external forces were the arm weight, the forces of 6 scapulo-humeral muscles, and the reaction at the glenohumeral joint, which was considered a spherical joint. Muscle wrapping was considered around the humeral head, which was assumed spherical. The dynamical equations were solved using a Lagrangian approach. The mathematical redundancy of the mechanical system was resolved in two steps: a pseudo-inverse optimization to minimize the square of the muscle stress, and a null-space optimization to restrict the muscle forces to physiological limits. Several movements were simulated. The mathematical and numerical aspects of the constrained redundancy problem were efficiently solved by the proposed method. The predicted muscle moment arms were consistent with cadaveric measurements, and the joint reaction force was consistent with in vivo measurements. This preliminary work demonstrates that the developed algorithm has great potential for more complex musculoskeletal modeling of the shoulder joint. In particular, it could be further applied to a non-spherical joint model, allowing for the natural translation of the humeral head in the glenoid fossa.
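The two-step redundancy resolution can be sketched in a few lines of linear algebra. The snippet below is an illustrative toy, not the authors' model: the moment-arm matrix, muscle cross-sections, force limits, and the simple clamping iteration are all assumptions, and the null-space correction is a naive heuristic rather than a full constrained optimization.

```python
import numpy as np

# Illustrative dimensions (assumptions): 3 joint torque components, 6 muscles.
rng = np.random.default_rng(1)
R = rng.standard_normal((3, 6))        # moment-arm matrix (placeholder values)
tau = np.array([5.0, -2.0, 1.0])       # required joint torques (placeholder)
pcsa = np.full(6, 4.0e-4)              # muscle cross-sections in m^2 (assumption)
f_max = 40.0 * pcsa * 1.0e4            # assumed maximum muscle forces in N

# Step 1: pseudo-inverse solution of R f = tau that minimizes the summed squared
# muscle stress (forces scaled by PCSA before taking the minimum-norm solution).
W = np.diag(pcsa)
f = W @ np.linalg.pinv(R @ W) @ tau

# Step 2: null-space correction -- add vectors from the null space of R so the
# torque balance is preserved while forces are pushed back inside [0, f_max].
N = np.eye(6) - np.linalg.pinv(R) @ R  # projector onto the null space of R
for _ in range(200):                   # naive projected clamping (sketch only)
    violation = np.minimum(f, 0.0) + np.maximum(f - f_max, 0.0)
    if not violation.any():
        break
    f = f - N @ violation

print("muscle forces:", f.round(2), "| torque error:", np.abs(R @ f - tau).max())
```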
Abstract:
Besides CYP2B6, other polymorphic enzymes contribute to the interindividual variability of efavirenz (EFV). This study aimed to quantify the impact of multiple alleles on EFV disposition. Plasma samples from 169 human immunodeficiency virus (HIV) patients characterized for CYP2B6, CYP2A6, and CYP3A4/5 allelic diversity were used to build a population pharmacokinetic model using NONMEM (non-linear mixed effects modeling), with the aim of finding a general approach combining genetic and demographic covariates. Average clearance (CL) was 11.3 l/h with a 65% interindividual variability, which was explained largely by CYP2B6 genetic variation (31%). CYP2A6 and CYP3A4 had a prominent influence on CL, mostly when CYP2B6 was impaired. Pharmacogenetics fully accounted for ethnicity, leaving body weight as the only significant demographic factor influencing CL. The square roots of the numbers of functional alleles best described the influence of each gene, without interaction. Functional genetic variations in both principal and accessory metabolic pathways thus have a joint impact on EFV disposition. Therefore, dosage adjustment in accordance with the type of polymorphism (CYP2B6, CYP2A6, or CYP3A4) is required in order to maintain EFV within the therapeutic target levels.
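As an illustration of how such covariates typically enter a population model (an assumed parameterization for the sake of example, not the published NONMEM model), clearance for patient i could be written as

\[
\mathrm{CL}_i \;=\; \theta_{\mathrm{CL}} \cdot \left(\frac{\mathrm{WT}_i}{70}\right)^{\theta_{\mathrm{WT}}} \cdot \prod_{g \in \{\mathrm{2B6},\,\mathrm{2A6},\,\mathrm{3A4/5}\}} \bigl(1 + \theta_g \sqrt{n_{g,i}}\bigr) \cdot e^{\eta_i},
\]

where \(n_{g,i}\) is the number of functional alleles of gene \(g\), \(\mathrm{WT}_i\) the body weight, the \(\theta\) terms are fixed effects, and \(\eta_i\) the random inter-individual effect. Each gene contributes through the square root of its functional allele count and without interaction terms, mirroring the covariate structure described in the abstract.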
Abstract:
In recent years, multi-atlas fusion methods have gained significant attention in medical image segmentation. In this paper, we propose a general Markov Random Field (MRF) based framework that can perform edge-preserving smoothing of the labels at the time of fusing the labels itself. More specifically, we formulate the label fusion problem with MRF-based neighborhood priors, as an energy minimization problem containing a unary data term and a pairwise smoothness term. We present how the existing fusion methods like majority voting, global weighted voting and local weighted voting methods can be reframed to profit from the proposed framework, for generating more accurate segmentations as well as more contiguous segmentations by getting rid of holes and islands. The proposed framework is evaluated for segmenting lymph nodes in 3D head and neck CT images. A comparison of various fusion algorithms is also presented.
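A minimal version of this energy-minimization view is sketched below on a 2-D grid: the unary term comes from (weighted) atlas votes, the pairwise term is a Potts smoothness prior, and a simple iterated-conditional-modes (ICM) sweep minimizes the energy. The grid size, smoothness weight, optimizer, and random vote data are all assumptions; the paper works on 3-D CT volumes.

```python
import numpy as np

# Sketch of MRF-regularised label fusion on a 2-D grid.
# votes[p, l]: fraction of atlases voting for label l at voxel p (placeholder data).
H, W, L = 64, 64, 3
rng = np.random.default_rng(0)
votes = rng.dirichlet(np.ones(L), size=(H, W))

unary = -np.log(votes + 1e-9)    # data term: negative log of the (weighted) votes
lam = 0.5                        # smoothness weight (assumption)
labels = votes.argmax(-1)        # initialise with plain (majority-style) voting

def neighbours(i, j):
    for di, dj in ((-1, 0), (1, 0), (0, -1), (0, 1)):
        if 0 <= i + di < H and 0 <= j + dj < W:
            yield i + di, j + dj

# Iterated conditional modes: greedily minimise E = sum(unary) + lam * Potts pairwise term.
for _ in range(5):
    for i in range(H):
        for j in range(W):
            costs = unary[i, j].copy()
            for ni, nj in neighbours(i, j):
                costs += lam * (np.arange(L) != labels[ni, nj])
            labels[i, j] = costs.argmin()
```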
Abstract:
We study the asymmetric and dynamic dependence between financial assets and demonstrate, from the perspective of risk management, the economic significance of dynamic copula models. First, we construct stock and currency portfolios sorted on different characteristics (ex ante beta, coskewness, cokurtosis and order flows), and find substantial evidence of dynamically evolving dependence between the high-beta (respectively, coskewness, cokurtosis and order flow) portfolios and the low-beta (coskewness, cokurtosis and order flow) portfolios. Second, using three different dependence measures, we show the presence of asymmetric dependence between these characteristic-sorted portfolios. Third, we use a dynamic copula framework based on Creal et al. (2013) and Patton (2012) to forecast the portfolio Value-at-Risk of long-short (high minus low) equity and FX portfolios, and compare it against several widely used univariate and multivariate VaR models. Backtesting our methodology, we find that the asymmetric dynamic copula models provide more accurate forecasts in general and, in particular, perform much better during the recent financial crises, indicating the economic significance of incorporating dynamic and asymmetric dependence in risk management.
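To make the VaR-forecasting step concrete, the sketch below computes the one-day VaR of a long-short portfolio by simulating from a static Student-t copula with empirical margins. The copula parameters, the placeholder return series, and the use of a static rather than a GAS-driven dynamic copula (as in the paper) are all simplifying assumptions.

```python
import numpy as np
from scipy import stats

# Placeholder return series for the 'high' and 'low' characteristic-sorted portfolios.
rng = np.random.default_rng(0)
r_high = 0.010 * rng.standard_normal(1000)
r_low = 0.008 * rng.standard_normal(1000)

rho, nu, n_sim = 0.6, 5.0, 100_000        # assumed copula correlation, dof, draws
C = np.linalg.cholesky(np.array([[1.0, rho], [rho, 1.0]]))
z = rng.standard_normal((n_sim, 2)) @ C.T                 # correlated normals
w = nu / rng.chisquare(nu, n_sim)                         # chi-square mixing
u = stats.t.cdf(z * np.sqrt(w)[:, None], df=nu)           # Student-t copula samples

sim_high = np.quantile(r_high, u[:, 0])   # map back through the empirical margins
sim_low = np.quantile(r_low, u[:, 1])
var_99 = -np.quantile(sim_high - sim_low, 0.01)           # 99% VaR of high-minus-low
print(f"99% one-day VaR: {var_99:.4f}")
```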
Abstract:
PECUBE is a three-dimensional thermal-kinematic code capable of solving the heat production-diffusion-advection equation under a temporally varying surface boundary condition. It was initially developed to assess the effects of time-varying surface topography (relief) on low-temperature thermochronological datasets. Thermochronometric ages are predicted by tracking the time-temperature histories of rock particles ending up at the surface and by combining these with various age-prediction models. In the decade since its inception, the PECUBE code has been under continuous development as its use has widened to address different tectonic-geomorphic problems. This paper describes several major recent improvements to the code, including its integration with an inverse-modeling package based on the Neighborhood Algorithm, the incorporation of fault-controlled kinematics, several different ways to address topographic and drainage change through time, the ability to predict subsurface (tunnel or borehole) data, the prediction of detrital thermochronology data together with a method to compare these with observations, and the coupling with landscape-evolution (or surface-process) models. Each new development is described together with one or several applications, so that the reader and potential user can clearly assess and make use of the capabilities of PECUBE. We end by describing some developments that are currently underway or should take place in the foreseeable future. (C) 2012 Elsevier B.V. All rights reserved.
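For reference, a standard form of the equation that such thermal-kinematic codes solve (written here generically, not necessarily in PECUBE's exact implementation) is

\[
\rho c \left(\frac{\partial T}{\partial t} + \mathbf{v} \cdot \nabla T\right) \;=\; \nabla \cdot (k \nabla T) + \rho H,
\]

where \(T\) is temperature, \(\mathbf{v}\) the rock advection (kinematic) velocity, \(k\) the thermal conductivity, \(\rho\) the density, \(c\) the specific heat capacity, and \(H\) the radiogenic heat production per unit mass, solved under a time-varying surface-temperature boundary condition that follows the evolving topography.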
Abstract:
Angiogenesis, the formation of new blood vessels sprouting from existing ones, occurs in several situations, such as wound healing, tissue remodeling, and near growing tumors. Under hypoxic conditions, tumor cells secrete growth factors, including VEGF. VEGF activates endothelial cells (ECs) in nearby vessels, leading to the migration of ECs out of the vessel and the formation of growing sprouts. A key process in angiogenesis is cellular self-organization, and previous modeling studies have identified mechanisms for producing networks and sprouts. Most theoretical studies of cellular self-organization during angiogenesis have ignored the interactions of ECs with the extra-cellular matrix (ECM), the gel-like or rigid materials that cells live in. Apart from providing structural support to cells, the ECM may play a key role in the coordination of cellular motility during angiogenesis. For example, by modifying the ECM, ECs can affect the motility of other ECs long after they have left. Here, we present an explorative study of the cellular self-organization resulting from such ECM-coordinated cell migration. We show that a set of biologically motivated cell behavioral rules, including chemotaxis, haptotaxis, haptokinesis, and ECM-guided proliferation, suffices for forming sprouts and branching vascular trees.
Abstract:
Drug delivery is one of the most common clinical routines in hospitals and is critical to patients' health and recovery. It involves a decision-making process in which a medical doctor decides the amount (dose) and frequency (dose interval) of administration on the basis of the available patient feature data and the doctor's clinical experience (a priori adaptation). This process can be computerized to make the prescription procedure fast, objective, inexpensive, non-invasive, and accurate. This paper proposes a Drug Administration Decision Support System (DADSS) to help clinicians and patients with computing the initial dose. The system is based on a Support Vector Machine (SVM) algorithm for estimating the potential drug concentration in the blood of a patient, from which the best combination of dose and dose interval is selected at the level of the DSS. The addition of the RANdom SAmple Consensus (RANSAC) technique enhances the prediction accuracy by selecting inliers for SVM modeling. Experiments performed on the drug imatinib case study show more than a 40% improvement in prediction accuracy compared with previous works. An important extension of the patient feature data is also proposed in this paper.
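The RANSAC-plus-SVM idea can be illustrated with a short, self-contained sketch: repeatedly fit a support vector regressor on random subsets, keep the fit with the most inliers, and refit on those inliers. The kernel, thresholds, iteration count, and the synthetic feature-to-concentration data below are assumptions for illustration, not the paper's configuration.

```python
import numpy as np
from sklearn.svm import SVR

def ransac_svr(X, y, n_iter=100, sample_size=20, resid_thresh=0.5, seed=0):
    """Naive RANSAC loop around an SVR regressor (illustrative sketch):
    fit on random subsets, keep the model with the most inliers, refit on them."""
    rng = np.random.default_rng(seed)
    best_inliers = None
    for _ in range(n_iter):
        idx = rng.choice(len(X), size=sample_size, replace=False)
        model = SVR(kernel="rbf", C=10.0, epsilon=0.1).fit(X[idx], y[idx])
        inliers = np.abs(model.predict(X) - y) < resid_thresh
        if best_inliers is None or inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    final = SVR(kernel="rbf", C=10.0, epsilon=0.1).fit(X[best_inliers], y[best_inliers])
    return final, best_inliers

# Toy data: 'patient features' -> 'concentration' with a few gross outliers (placeholder).
rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(200, 4))
y = 2.0 * X[:, 0] + X[:, 1] ** 2 + 0.05 * rng.standard_normal(200)
y[:10] += 5.0                      # simulated outlier measurements

model, inliers = ransac_svr(X, y)
print("inliers kept:", int(inliers.sum()), "/", len(y))
```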
Abstract:
We study the properties of the well-known replicator dynamics when applied to a finitely repeated version of the Prisoner's Dilemma game. We characterize the behavior of the dynamics under strongly simplifying assumptions (i.e. only 3 strategies are available) and show that the basin of attraction of defection shrinks as the number of repetitions increases. After discussing the difficulties involved in trying to relax these strongly simplifying assumptions, we approach the same model by means of simulations based on genetic algorithms. The resulting simulations exhibit behavior very close to that predicted by the replicator dynamics, without imposing any of the assumptions of the mathematical model. Our main conclusion is that mathematical and computational models are good complements for research in the social sciences. Indeed, while computational models are extremely useful for extending the scope of the analysis to complex scenarios that are hard to analyze mathematically, formal models are useful for verifying and explaining the outcomes of computational models.
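A compact numerical sketch of the 3-strategy case is given below, integrating the replicator equation dx_i/dt = x_i((Ax)_i - x'Ax) for an N-round Prisoner's Dilemma among always-cooperate, always-defect, and tit-for-tat players. The stage-game payoffs, the strategy set, the number of rounds, and the initial population shares are assumptions chosen for illustration, not necessarily those used in the paper.

```python
import numpy as np
from scipy.integrate import odeint

# Stage-game payoffs (assumed standard values with T > R > P > S).
T_, R_, P_, S_ = 5.0, 3.0, 1.0, 0.0
N = 10  # number of repetitions (assumption)

# Total payoffs over N rounds; rows are the focal strategy, columns the opponent,
# in the order [AllC, AllD, TFT].
A = np.array([
    [R_ * N,  S_ * N,            R_ * N],             # AllC vs (AllC, AllD, TFT)
    [T_ * N,  P_ * N,            T_ + P_ * (N - 1)],  # AllD
    [R_ * N,  S_ + P_ * (N - 1), R_ * N],             # TFT
])

def replicator(x, t):
    """Replicator dynamics: dx_i/dt = x_i * ((A x)_i - x.A.x)."""
    fitness = A @ x
    avg = x @ fitness
    return x * (fitness - avg)

x0 = np.array([0.3, 0.4, 0.3])          # initial population shares (assumption)
t = np.linspace(0.0, 50.0, 501)
traj = odeint(replicator, x0, t)
print("final shares (AllC, AllD, TFT):", traj[-1].round(3))
```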