869 results for Coding Error Isolation


Relevance: 20.00%

Abstract:

Protein-coding genes evolve at different rates, and the influence of different parameters, from gene size to expression level, has been extensively studied. While in yeast gene expression level is the major causal factor of gene evolutionary rate, the situation is more complex in animals. Here we investigate these relations further, in particular taking into account gene expression in different organs as well as indirect correlations between parameters. We used RNA-seq data from two large datasets, covering 22 mouse tissues and 27 human tissues. Over all tissues, evolutionary rate correlates only weakly with levels and breadth of expression. The strongest explanatory factors of purifying selection are GC content, expression in many developmental stages, and expression in brain tissues. While the main component of evolutionary rate is purifying selection, we also find tissue-specific patterns for sites under neutral evolution and for positive selection. We observe fast evolution of genes expressed in testis, but also in other tissues, notably liver; this is explained by weak purifying selection rather than by positive selection.
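The abstract's caveat about indirect correlations between parameters can be illustrated with a partial-correlation check. The sketch below uses hypothetical synthetic data (the names `gc`, `expr`, `rate` are illustrative, not the study's data) to show how a raw expression–rate correlation can largely vanish once a shared driver such as GC content is regressed out.

```python
import numpy as np

def partial_corr(x, y, z):
    """Correlation between x and y after regressing out the confounder z."""
    z1 = np.column_stack([np.ones_like(z), z])
    rx = x - z1 @ np.linalg.lstsq(z1, x, rcond=None)[0]
    ry = y - z1 @ np.linalg.lstsq(z1, y, rcond=None)[0]
    return np.corrcoef(rx, ry)[0, 1]

rng = np.random.default_rng(0)
gc = rng.normal(size=500)                # stand-in for GC content
expr = 0.8 * gc + rng.normal(size=500)   # expression partly driven by gc
rate = -0.7 * gc + rng.normal(size=500)  # evolutionary rate partly driven by gc

raw = np.corrcoef(expr, rate)[0, 1]      # apparent negative correlation
partial = partial_corr(expr, rate, gc)   # near zero once gc is controlled
```

Here the raw correlation is an artifact of the common cause; the partial correlation makes the indirect link visible.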

Relevance: 20.00%

Abstract:

Prediction filters are well-known models for signal estimation in communications, control, and many other areas. The classical method for deriving linear prediction coding (LPC) filters is based on the minimization of a mean square error (MSE). Consequently, only second-order statistics are required, but the estimate is optimal only if the residue is independent and identically distributed (iid) Gaussian. In this paper, we derive the ML estimate of the prediction filter. Relationships with robust estimation of auto-regressive (AR) processes, with blind deconvolution, and with source separation based on mutual information minimization are then detailed. The algorithm, based on the minimization of a higher-order statistics criterion, uses on-line estimation of the residue statistics. Experimental results highlight the interest of this approach.
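For contrast with the ML approach the paper proposes, the classical MSE-based derivation mentioned above can be sketched as a plain least-squares problem. The AR(2) process and its coefficients below are hypothetical, chosen only to verify that MSE minimization recovers the prediction coefficients.

```python
import numpy as np

def lpc_coeffs(x, order):
    """Classical MSE-based LPC: solve the least-squares problem
    x[n] ~ sum_k a[k] * x[n-k] for the prediction coefficients a."""
    # Lagged design matrix (covariance method): column k holds x[n-k-1].
    X = np.column_stack(
        [x[order - k - 1 : len(x) - k - 1] for k in range(order)]
    )
    y = x[order:]
    a, *_ = np.linalg.lstsq(X, y, rcond=None)
    return a

# Hypothetical AR(2) process: x[n] = 0.75 x[n-1] - 0.5 x[n-2] + e[n]
rng = np.random.default_rng(1)
e = rng.normal(size=20000)
x = np.zeros_like(e)
for n in range(2, len(x)):
    x[n] = 0.75 * x[n - 1] - 0.5 * x[n - 2] + e[n]

a = lpc_coeffs(x, order=2)   # close to [0.75, -0.5]
```

With an iid Gaussian residue, as here, this least-squares estimate coincides with the ML estimate; the paper's point is that for non-Gaussian residues the two differ, motivating the higher-order statistics criterion.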

Relevance: 20.00%

Abstract:

A method for optimizing the strength of a parametric phase mask for a wavefront coding imaging system is presented. The method is based on an optimization process that minimizes a proposed merit function. The goal is to achieve modulation transfer function invariance while quantitatively maintaining final image fidelity. A parametric filter that copes with the noise present in the captured images is used to obtain the final images, and this filter is optimized. The whole process results in optimum phase mask strength and optimal parameters for the restoration filter. The results for a particular optical system are presented and tested experimentally in the laboratory. The experimental results show good agreement with the simulations, indicating that the procedure is useful.

Relevance: 20.00%

Abstract:

This thesis studies the evaluation of software development practices through error analysis. The work presents the software development process, software testing, software errors, error classification, and software process improvement methods. The practical part of the work presents results from the error analysis of one software process and gives improvement ideas for the project. It was noticed that the classification of the error data in the project was inadequate; because of this, the error data could not be used effectively. With the error analysis we were able to show that there were deficiencies in the design and analysis phases, in the implementation phase, and in the testing phase. The work gives ideas for improving error classification and software development practices.
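A minimal sketch of the kind of phase-wise tally such an error analysis rests on, using hypothetical defect records (the record layout and phase labels are illustrative, not the thesis's data). The thesis's finding is precisely that without an adequate classification field per defect, a tally like this cannot be computed.

```python
from collections import Counter

# Hypothetical defect records: (id, phase_where_introduced, severity).
defects = [
    (1, "design", "major"), (2, "implementation", "minor"),
    (3, "design", "major"), (4, "testing", "minor"),
    (5, "implementation", "major"), (6, "design", "minor"),
]

# Tally defects by the phase in which they were introduced.
by_phase = Counter(phase for _, phase, _ in defects)
worst = by_phase.most_common(1)[0]   # phase with the most defects
```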

Relevance: 20.00%

Abstract:

In the southeastern sector of the Ebro Basin, the paleomagnetic inclination obtained in the Oligocene alluvial successions is considerably lower than expected from the reference paleolatitude calculated for this region during the Oligocene. This inclination error may be due to several factors, such as hydrodynamic control of the magnetic particles in the depositional environment, differential compaction of the sediment during burial, or tectonic deformation. This work has focused on its study in two dominantly alluvial successions whose magnetostratigraphy had previously been established. The alluvial and lacustrine lithofacies studied were grouped into five groups: grey sandstones, red and variegated sandstones, red silts, red mudstones, and limestones. A correlation between phyllosilicate abundance and inclination error has been demonstrated. Thus, lithofacies with a low percentage of phyllosilicates (limestones and grey sandstones) show errors of about 5°, statistically non-significant with respect to the reference inclination. In contrast, in materials with a higher percentage of phyllosilicates (silts and clays) the error can reach 25°. This has no bearing on the interpretation of magnetic polarities, but it does affect palinspastic and paleogeographic reconstructions based on paleolatitudes computed from paleoinclinations. The results obtained demonstrate the need for caution when drawing conclusions based exclusively on this type of information.

Relevance: 20.00%

Abstract:

Our consumption of groundwater, in particular as drinking water or for irrigation, has increased considerably over the years. Numerous problems then arise, ranging from the prospection of new resources to the remediation of polluted aquifers. Regardless of the hydrogeological problem considered, the main challenge remains the characterization of the subsurface properties. A stochastic approach is then necessary to represent this uncertainty, by considering multiple geological scenarios and generating a large number of geostatistical realizations. We then encounter the main limitation of these approaches, namely the computational cost of simulating complex flow processes for each of these realizations. In the first part of the thesis, this problem is investigated in the context of uncertainty propagation, where an ensemble of realizations is identified as representing the subsurface properties. To propagate this uncertainty to the quantity of interest while limiting the computational cost, current methods rely on approximate flow models. This allows the identification of a subset of realizations representing the variability of the initial ensemble. The complex flow model is then evaluated only for this subset, and inference is made on the basis of these complex responses. Our objective is to improve the performance of this approach by using all the available information. To this end, the subset of approximate and exact responses is used to build an error model, which then serves to correct the remaining approximate responses and predict the response of the complex model. This method maximizes the use of the available information without any perceptible increase in computation time. The uncertainty propagation is then more accurate and more robust.
The strategy explored in the first chapter consists in learning, from a subset of realizations, the relationship between the approximate and complex flow models. In the second part of the thesis, this methodology is formalized mathematically by introducing a regression model between functional responses. As this problem is ill-posed, its dimensionality must be reduced. In this respect, the novelty of the work presented lies in the use of functional principal component analysis (FPCA), which not only performs the dimensionality reduction while maximizing the retained information, but also allows the quality of the error model to be diagnosed in this functional space. The proposed methodology is applied to a pollution problem involving a non-aqueous phase liquid, and the results obtained show that the error model enables a strong reduction in computation time while correctly estimating the uncertainty. Moreover, for each approximate response, a prediction of the complex response is provided by the error model. The concept of a functional error model is therefore relevant not only for uncertainty propagation but also for Bayesian inference problems. Markov chain Monte Carlo (MCMC) methods are the most commonly used algorithms for generating geostatistical realizations consistent with the observations. However, these methods suffer from a very low acceptance rate in high-dimensional problems, resulting in a large number of wasted flow simulations. A two-stage approach, "two-stage MCMC", was introduced to avoid unnecessary simulations of the complex model through a preliminary evaluation of the proposal. In the third part of the thesis, the approximate flow model coupled with an error model serves as the preliminary evaluation for the two-stage MCMC.
We demonstrate an increase in acceptance rate by a factor of 1.5 to 3 compared with a classical MCMC implementation. One question remains open: how to choose the size of the training set and how to identify the realizations that optimize the construction of the error model. This requires an iterative strategy so that, with each new flow simulation, the error model is improved by incorporating the new information. This is developed in the fourth part of the thesis, where the methodology is applied to a saline-intrusion problem in a coastal aquifer. -- Our consumption of groundwater, in particular as drinking water and for irrigation, has considerably increased over the years and groundwater is becoming an increasingly scarce and endangered resource. Nowadays, we are facing many problems ranging from water prospection to sustainable management and remediation of polluted aquifers. Independently of the hydrogeological problem, the main challenge remains dealing with the incomplete knowledge of the underground properties. Stochastic approaches have been developed to represent this uncertainty by considering multiple geological scenarios and generating a large number of realizations. The main limitation of this approach is the computational cost associated with performing complex flow simulations in each realization. In the first part of the thesis, we explore this issue in the context of uncertainty propagation, where an ensemble of geostatistical realizations is identified as representative of the subsurface uncertainty. To propagate this lack of knowledge to the quantity of interest (e.g., the concentration of pollutant in extracted water), it is necessary to evaluate the flow response of each realization.
Due to computational constraints, state-of-the-art methods make use of approximate flow simulations to identify a subset of realizations that represents the variability of the ensemble. The complex and computationally heavy flow model is then run for this subset, based on which inference is made. Our objective is to increase the performance of this approach by using all of the available information and not solely the subset of exact responses. Two error models are proposed to correct the approximate responses following a machine learning approach. For the subset identified by a classical approach (here the distance kernel method) both the approximate and the exact responses are known. This information is used to construct an error model and correct the ensemble of approximate responses to predict the "expected" responses of the exact model. The proposed methodology makes use of all the available information without perceptible additional computational costs and leads to an increase in accuracy and robustness of the uncertainty propagation. The strategy explored in the first chapter consists in learning from a subset of realizations the relationship between proxy and exact curves. In the second part of this thesis, the strategy is formalized in a rigorous mathematical framework by defining a regression model between functions. As this problem is ill-posed, it is necessary to reduce its dimensionality. The novelty of the work comes from the use of functional principal component analysis (FPCA), which not only performs the dimensionality reduction while maximizing the retained information, but also allows a diagnostic of the quality of the error model in the functional space. The proposed methodology is applied to a pollution problem involving a non-aqueous phase liquid. The error model allows a strong reduction of the computational cost while providing a good estimate of the uncertainty.
The individual correction of the proxy response by the error model leads to an excellent prediction of the exact response, opening the door to many applications. The concept of functional error model is useful not only in the context of uncertainty propagation, but also, and maybe even more so, to perform Bayesian inference. Monte Carlo Markov Chain (MCMC) algorithms are the most common choice to ensure that the generated realizations are sampled in accordance with the observations. However, this approach suffers from low acceptance rate in high dimensional problems, resulting in a large number of wasted flow simulations. This led to the introduction of two-stage MCMC, where the computational cost is decreased by avoiding unnecessary simulations of the exact flow model thanks to a preliminary evaluation of the proposal. In the third part of the thesis, a proxy is coupled to an error model to provide an approximate response for the two-stage MCMC set-up. We demonstrate an increase in acceptance rate by a factor three with respect to one-stage MCMC results. An open question remains: how do we choose the size of the learning set and identify the realizations to optimize the construction of the error model. This requires devising an iterative strategy to construct the error model, such that, as new flow simulations are performed, the error model is iteratively improved by incorporating the new information. This is discussed in the fourth part of the thesis, in which we apply this methodology to a problem of saline intrusion in a coastal aquifer.
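A minimal numpy sketch of the functional error-model workflow under stated assumptions: synthetic proxy/exact curves with a systematic proxy bias, plain SVD-based principal components as a stand-in for FPCA, and a linear regression between score vectors. It illustrates the idea (learn the proxy-to-exact correction on a training subset, then correct the remaining proxy responses), not the thesis's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(2)
t = np.linspace(0, 1, 50)                 # time axis of the responses

# Synthetic "exact" responses and a systematically biased, noisy proxy.
params = rng.uniform(2, 8, size=200)
exact = np.array([1 / (1 + np.exp(-p * (t - 0.5))) for p in params])
proxy = 0.9 * exact + 0.05 + 0.01 * rng.normal(size=exact.shape)

train = np.arange(30)                     # realizations also run exactly
resid = exact[train] - proxy[train]       # known errors on the subset

# Dimension reduction of proxy and residual curves (FPCA stand-in).
pm = proxy[train].mean(axis=0)
_, _, Vp = np.linalg.svd(proxy[train] - pm, full_matrices=False)
rm = resid.mean(axis=0)
_, _, Vr = np.linalg.svd(resid - rm, full_matrices=False)
Bp, Br = Vp[:3], Vr[:3]                   # leading modes

# Linear regression between score vectors: proxy scores -> residual scores.
Sp = (proxy[train] - pm) @ Bp.T
Sr = (resid - rm) @ Br.T
W, *_ = np.linalg.lstsq(Sp, Sr, rcond=None)

# Correct the remaining proxies without running the exact model.
test = np.arange(30, 200)
corrected = proxy[test] + rm + ((proxy[test] - pm) @ Bp.T @ W) @ Br

err_before = np.abs(exact[test] - proxy[test]).mean()
err_after = np.abs(exact[test] - corrected).mean()   # much smaller
```

The corrected curves could then stand in for exact responses in uncertainty propagation, or serve as the cheap first-stage evaluation in two-stage MCMC.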

Relevance: 20.00%

Abstract:

AIMS/HYPOTHESIS: Exposure of pancreatic beta cells to cytokines released by islet-infiltrating immune cells induces alterations in gene expression, leading to impaired insulin secretion and apoptosis in the initial phases of type 1 diabetes. Long non-coding RNAs (lncRNAs) are a new class of transcripts participating in the development of many diseases. As little is known about their role in insulin-secreting cells, this study aimed to evaluate their contribution to beta cell dysfunction. METHODS: The expression of lncRNAs was determined by microarray in the MIN6 beta cell line exposed to proinflammatory cytokines. The changes induced by cytokines were further assessed by real-time PCR in islets of control and NOD mice. The involvement of selected lncRNAs modified by cytokines was assessed after their overexpression in MIN6 cells and primary islet cells. RESULTS: MIN6 cells were found to express a large number of lncRNAs, many of which were modified by cytokine treatment. The changes in the level of selected lncRNAs were confirmed in mouse islets and an increase in these lncRNAs was also seen in prediabetic NOD mice. Overexpression of these lncRNAs in MIN6 and mouse islet cells, either alone or in combination with cytokines, favoured beta cell apoptosis without affecting insulin production or secretion. Furthermore, overexpression of lncRNA-1 promoted nuclear translocation of nuclear factor of κ light polypeptide gene enhancer in B cells 1 (NF-κB). CONCLUSIONS/INTERPRETATION: Our study shows that lncRNAs are modulated during the development of type 1 diabetes in NOD mice, and that their overexpression sensitises beta cells to apoptosis, probably contributing to their failure during the initial phases of the disease.

Relevance: 20.00%

Abstract:

Barmumycin was isolated from an extract of the marine actinomycete Streptomyces sp. BOSC-022A and found to be cytotoxic against various human tumor cell lines. Based on preliminary one- and two-dimensional 1H- and 13C-NMR spectra, the natural compound was initially assigned the structure of macrolactone-type compound 1, which was later prepared by two different routes. However, major spectroscopic differences between isolated barmumycin and 1 led to revision of the proposed structure as E-16. Based on synthesis of this new compound, and subsequent spectroscopic comparison of it to an authentic sample of barmumycin, the structure of the natural compound was indeed confirmed as that of E-16.

Relevance: 20.00%

Abstract:

The discovery of long non-coding RNA (lncRNA) has dramatically altered our understanding of cancer. Here, we describe a comprehensive analysis of lncRNA alterations at transcriptional, genomic, and epigenetic levels in 5,037 human tumor specimens across 13 cancer types from The Cancer Genome Atlas. Our results suggest that the expression and dysregulation of lncRNAs are highly cancer type specific compared with protein-coding genes. Using the integrative data generated by this analysis, we present a clinically guided small interfering RNA screening strategy and a co-expression analysis approach to identify cancer driver lncRNAs and predict their functions. This provides a resource for investigating lncRNAs in cancer and lays the groundwork for the development of new diagnostics and treatments.

Relevance: 20.00%

Abstract:

PURPOSE: To meta-analyze the literature on the clinical performance of Class V restorations to assess the factors that influence retention, marginal integrity, and marginal discoloration of cervical lesions restored with composite resins, glass-ionomer-cement-based materials [glass-ionomer cement (GIC) and resin-modified glass ionomers (RMGICs)], and polyacid-modified resin composites (PMRC). MATERIALS AND METHODS: The English literature was searched (MEDLINE and SCOPUS) for prospective clinical trials on cervical restorations with an observation period of at least 18 months. The studies had to report retention, marginal discoloration, marginal integrity, and marginal caries and include a description of the operative technique (beveling of enamel, roughening of dentin, type of isolation). Eighty-one studies involving 185 experiments for 47 adhesives matched the inclusion criteria. The statistical analysis was carried out using the following linear mixed model: log(−log(Y/100)) = β + α·log(T) + error, with β = log(λ), where β is a summary measure of the non-linear deterioration occurring in each experiment, including a random study effect. RESULTS: On average, 12.3% of the cervical restorations were lost, 27.9% exhibited marginal discoloration, and 34.6% exhibited deterioration of marginal integrity after 5 years. The calculated clinical index was 17.4% of failures after 5 years and 32.3% after 8 years. A higher variability was found for retention loss and marginal discoloration. Hardly any secondary caries lesions were detected, even in the experiments with a follow-up time longer than 8 years. Restorations placed using rubber-dam in teeth whose dentin was roughened showed a statistically significantly higher retention rate than those placed in teeth with unprepared dentin or without rubber-dam (p < 0.05). However, enamel beveling had no influence on any of the examined variables.
Significant differences were found between pairs of adhesive systems and also between pairs of classes of adhesive systems. One-step self-etching had a significantly worse clinical index than two-step self-etching and three-step etch-and-rinse (p = 0.026 and p = 0.002, respectively). CONCLUSION: The clinical performance is significantly influenced by the type of adhesive system and/or the adhesive class to which the system belongs. Whether the dentin/enamel is roughened or not and whether rubber-dam isolation is used or not also significantly influenced the clinical performance. Composite resin restorations placed with two-step self-etching and three-step etch-and-rinse adhesive systems should be preferred over one-step self-etching adhesive systems, GIC-based materials, and PMRCs.
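The reported linear mixed model implies a retention curve Y(T) = 100·exp(−λ·T^α). As an illustration of the model's shape only, the sketch below back-solves λ and α from the two clinical-index figures quoted in the abstract (17.4% failures at 5 years, 32.3% at 8 years); these are not the paper's fitted mixed-model parameters, which include study-level random effects.

```python
import math

def survival(T, lam, alpha):
    """Retention (%) implied by log(-log(Y/100)) = log(lam) + alpha*log(T)."""
    return 100.0 * math.exp(-lam * T ** alpha)

# Back-solve lam and alpha from the two reported clinical-index points.
h5 = -math.log(1 - 0.174)          # -log(Y/100) at T = 5 years
h8 = -math.log(1 - 0.323)          # -log(Y/100) at T = 8 years
alpha = math.log(h8 / h5) / math.log(8 / 5)
lam = h5 / 5 ** alpha

y5 = survival(5, lam, alpha)       # recovers ~82.6 % retained at 5 years
y10 = survival(10, lam, alpha)     # extrapolated retention at 10 years
```

Because α > 1 here, the implied hazard accelerates with time, consistent with the abstract's description of non-linear deterioration.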

Relevance: 20.00%

Abstract:

This study used event-related brain potentials to investigate whether math anxiety is related to abnormal error monitoring processing. Seventeen high math-anxious (HMA) and seventeen low math-anxious (LMA) individuals were presented with a numerical and a classical Stroop task. Groups did not differ in terms of trait or state anxiety. We found enhanced error-related negativity (ERN) in the HMA group when subjects committed an error on the numerical Stroop task, but not on the classical Stroop task. Groups did not differ in terms of the correct-related negativity component (CRN), the error positivity component (Pe), classical behavioral measures or post-error measures. The amplitude of the ERN was negatively related to participants' math anxiety scores, showing a more negative amplitude as the score increased. Moreover, using standardized low resolution electromagnetic tomography (sLORETA) we found greater activation of the insula in errors on a numerical task as compared to errors in a nonnumerical task only for the HMA group. The results were interpreted according to the motivational significance theory of the ERN.

Relevance: 20.00%

Abstract:

Ease of worldwide travel provides increased opportunities for organisms not only to colonize new environments but also to encounter related but diverged populations. Such events of reconnection and secondary contact of previously isolated populations are widely observed at different time scales. For example, during the quaternary glaciation, sea water level fluctuations caused temporal isolation of populations, often to be followed by secondary contact. At shorter time scales, population isolation and reconnection of viruses are commonly observed, and such events are often associated with epidemics and pandemics. Here, using coalescent theory and simulations, we describe the temporal impact of population reconnection after isolation on nucleotide differences and the site frequency spectrum, as well as common summary statistics of DNA variation. We identify robust genomic signatures of population reconnection after isolation. We utilize our development to infer the recent evolutionary history of human immunodeficiency virus 1 (HIV-1) in Asia and South America, successfully retrieving the successive HIV subtype colonization events in these regions. Our analysis reveals that divergent HIV-1 subtype populations are currently admixing in these regions, suggesting that HIV-1 may be undergoing a process of homogenization, contrary to popular belief.
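The site frequency spectrum summarized in this abstract can be computed directly from a sample of haplotypes. The 0/1 genotype matrix below is hypothetical (rows are sampled sequences, columns are sites, 0 the ancestral allele); the function tallies, for each derived-allele count i, how many segregating sites carry the derived allele in exactly i sequences.

```python
import numpy as np

def site_frequency_spectrum(genotypes):
    """Unfolded SFS from a (samples x sites) 0/1 matrix: entry i-1 counts
    sites where the derived allele appears in exactly i of the n sequences."""
    n = genotypes.shape[0]
    counts = genotypes.sum(axis=0)
    counts = counts[(counts > 0) & (counts < n)]   # segregating sites only
    return np.bincount(counts, minlength=n)[1:n]

# Hypothetical sample: 6 haplotypes, 8 sites.
g = np.array([
    [1, 0, 0, 1, 0, 1, 0, 0],
    [1, 0, 0, 0, 0, 1, 0, 0],
    [0, 1, 0, 0, 0, 1, 0, 1],
    [0, 0, 0, 0, 1, 1, 0, 1],
    [0, 0, 0, 0, 0, 1, 0, 1],
    [0, 0, 0, 0, 0, 1, 0, 1],
])

sfs = site_frequency_spectrum(g)   # three singletons, one doubleton, one quadrupleton
```

Distortions of this spectrum relative to neutral expectations are the kind of genomic signature the study uses to detect reconnection after isolation.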