985 results for error rates


Relevance:

20.00%

Publisher:

Abstract:

Our consumption of groundwater, in particular as drinking water and for irrigation, has considerably increased over the years, and groundwater is becoming an increasingly scarce and endangered resource. Nowadays, we are facing many problems ranging from water prospection to sustainable management and remediation of polluted aquifers. Independently of the hydrogeological problem, the main challenge remains dealing with the incomplete knowledge of the underground properties. Stochastic approaches have been developed to represent this uncertainty by considering multiple geological scenarios and generating a large number of realizations. The main limitation of this approach is the computational cost associated with performing complex flow simulations in each realization. In the first part of the thesis, we explore this issue in the context of uncertainty propagation, where an ensemble of geostatistical realizations is identified as representative of the subsurface uncertainty. To propagate this lack of knowledge to the quantity of interest (e.g., the concentration of pollutant in extracted water), it is necessary to evaluate the flow response of each realization.
Due to computational constraints, state-of-the-art methods make use of approximate flow simulations to identify a subset of realizations that represents the variability of the ensemble. The complex and computationally heavy flow model is then run for this subset, based on which inference is made. Our objective is to increase the performance of this approach by using all of the available information and not solely the subset of exact responses. Two error models are proposed to correct the approximate responses following a machine learning approach. For the subset identified by a classical approach (here the distance kernel method), both the approximate and the exact responses are known. This information is used to construct an error model and correct the ensemble of approximate responses to predict the "expected" responses of the exact model. The proposed methodology makes use of all the available information without perceptible additional computational costs and leads to an increase in accuracy and robustness of the uncertainty propagation. The strategy explored in the first chapter consists in learning, from a subset of realizations, the relationship between proxy and exact curves. In the second part of this thesis, the strategy is formalized in a rigorous mathematical framework by defining a regression model between functions. As this problem is ill-posed, it is necessary to reduce its dimensionality. The novelty of the work comes from the use of functional principal component analysis (FPCA), which not only performs the dimensionality reduction while maximizing the retained information, but also allows a diagnostic of the quality of the error model in the functional space. The proposed methodology is applied to a pollution problem by a non-aqueous phase liquid. The error model allows a strong reduction of the computational cost while providing a good estimate of the uncertainty.
The individual correction of the proxy response by the error model leads to an excellent prediction of the exact response, opening the door to many applications. The concept of a functional error model is useful not only in the context of uncertainty propagation, but also, and maybe even more so, to perform Bayesian inference. Markov chain Monte Carlo (MCMC) algorithms are the most common choice to ensure that the generated realizations are sampled in accordance with the observations. However, this approach suffers from a low acceptance rate in high-dimensional problems, resulting in a large number of wasted flow simulations. This led to the introduction of two-stage MCMC, where the computational cost is decreased by avoiding unnecessary simulations of the exact flow model thanks to a preliminary evaluation of the proposal. In the third part of the thesis, a proxy is coupled to an error model to provide an approximate response for the two-stage MCMC set-up. We demonstrate an increase in acceptance rate by a factor of three with respect to one-stage MCMC results. An open question remains: how do we choose the size of the learning set and identify the realizations that optimize the construction of the error model? This requires devising an iterative strategy to construct the error model, such that, as new flow simulations are performed, the error model is iteratively improved by incorporating the new information. This is discussed in the fourth part of the thesis, in which we apply this methodology to a problem of saline intrusion in a coastal aquifer.
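The FPCA-based error model described in this abstract can be illustrated with a minimal numerical sketch (Python/NumPy; the data, dimensions, and the linear regression between score spaces are hypothetical stand-ins, not the thesis code): proxy and exact curves are projected onto their principal component functions, a regression is fitted between the two score spaces on a small training subset, and the fitted map corrects every remaining proxy curve.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-ins: each row is one realization's output curve (e.g. a
# breakthrough curve) sampled at 50 time steps. The "proxy" is a cheap,
# biased and noisy version of the "exact" flow response.
n_real, n_t, n_train, k = 200, 50, 40, 10
t = np.linspace(0.0, 1.0, n_t)
centers = rng.uniform(0.3, 0.7, n_real)
exact = np.exp(-((t[None, :] - centers[:, None]) / 0.1) ** 2)
proxy = 0.8 * exact + 0.02 * rng.standard_normal((n_real, n_t))

# Only this small subset gets an exact (expensive) simulation.
train = rng.choice(n_real, n_train, replace=False)

def fpca(X, k):
    """Discretized FPCA: mean curve plus first k principal component functions."""
    mean = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mean, full_matrices=False)
    return mean, Vt[:k]

mp, Vp = fpca(proxy[train], k)
me, Ve = fpca(exact[train], k)

# FPCA scores of the training subset in each basis.
Sp = (proxy[train] - mp) @ Vp.T
Se = (exact[train] - me) @ Ve.T

# Linear error model between score spaces, learned on the training subset only.
B, *_ = np.linalg.lstsq(Sp, Se, rcond=None)

# Correct every proxy curve to predict the exact flow response.
predicted = ((proxy - mp) @ Vp.T) @ B @ Ve + me

err_proxy = np.abs(proxy - exact).mean()
err_corrected = np.abs(predicted - exact).mean()
```

On this toy problem the corrected curves are markedly closer to the exact responses than the raw proxy, at the cost of only `n_train` exact simulations.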


BACKGROUND AND AIMS: Smoking is a crucial environmental factor in inflammatory bowel disease (IBD). However, knowledge of patient characteristics associated with smoking, time trends of smoking rates, gender differences and supportive measures to cease smoking provided by physicians is scarce. We aimed to address these questions in Swiss IBD patients. METHODS: Prospectively obtained data from patients participating in the Swiss IBD cohort study were analysed and compared to the general Swiss population (GSP) matched by age, sex and year. RESULTS: Among a total of 1770 IBD patients analysed (49.1% male), 29% are current smokers. More than twice as many patients with Crohn's disease (CD) are active smokers compared to ulcerative colitis (UC, 39.6% vs. 15.3%, p<0.001). In striking contrast to the GSP, significantly more women than men with CD smoke (42.8% vs. 35.8%, p=0.025), and the overall smoking rate is also significantly increased compared with the GSP in women but not in men. The vast majority of smoking IBD patients (90.5%) claim never to have received any support to achieve smoking cessation, significantly more in UC compared to CD. We identify a significantly negative association between smoking and primary sclerosing cholangitis, indicative of a protective effect. Psychological distress in CD is significantly higher in smokers compared to non-smokers, but does not differ in UC. CONCLUSIONS: Despite well-established detrimental effects, smoking rates in CD are alarmingly high, with persistent and stagnating elevations compared to the GSP, especially in female patients. Importantly, there appears to be an unacceptable underuse of supportive measures to achieve smoking cessation.


BACKGROUND: Compared with usual care, noninvasive ventilation (NIV) lowers the risk of intubation and death for subjects with respiratory failure secondary to COPD exacerbations, but whether administration of NIV by a specialized, dedicated team improves its efficiency remains uncertain. Our aim was to test whether a dedicated team of respiratory therapists applying all acute NIV treatments would reduce the risk of intubation or death for subjects with COPD admitted for respiratory failure. METHODS: We carried out a retrospective study comparing subjects with COPD admitted to the ICU before (2001-2003) and after (2010-2012) the creation of a dedicated NIV team in a regional acute care hospital. The primary outcome was the risk of intubation or death. The secondary outcomes were the individual components of the primary outcome and ICU/hospital stay. RESULTS: A total of 126 subjects were included: 53 in the first cohort and 73 in the second. There was no significant difference in the demographic characteristics and severity of respiratory failure. Fifteen subjects (28.3%) died or had to undergo tracheal intubation in the first cohort, and only 10 subjects (13.7%) in the second cohort (odds ratio 0.40, 95% CI 0.16-0.99, P = .04). In-hospital mortality (15.1% vs 4.1%, P = .03) and median stay (ICU: 3.1 vs 1.9 d, P = .04; hospital: 11.5 vs 9.6 d, P = .04) were significantly lower in the second cohort, and a trend for a lower intubation risk was observed (20.8% vs 11%, P = .13). CONCLUSIONS: The delivery of NIV by a dedicated team was associated with a lower risk of death or intubation in subjects with respiratory failure secondary to COPD exacerbations. Therefore, the implementation of a team administering all NIV treatments on a 24-h basis should be considered in institutions admitting subjects with COPD exacerbations.
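The reported effect size follows directly from the counts in the abstract. As a quick check (Python; the confidence interval uses the standard Woolf log-odds method, which may differ marginally from the authors' exact procedure):

```python
import math

# 2x2 table from the abstract: intubation-or-death events per cohort.
events_1, n_1 = 15, 53   # before the dedicated NIV team
events_2, n_2 = 10, 73   # after

a, b = events_2, n_2 - events_2   # second cohort: events, non-events
c, d = events_1, n_1 - events_1   # first cohort: events, non-events

odds_ratio = (a * d) / (b * c)

# Woolf (log) method for the 95% confidence interval.
se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se)
hi = math.exp(math.log(odds_ratio) + 1.96 * se)

print(f"OR = {odds_ratio:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
# OR = 0.40, 95% CI 0.16-0.98 -- close to the reported 0.16-0.99
```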


This study used event-related brain potentials to investigate whether math anxiety is related to abnormal error-monitoring processing. Seventeen high math-anxious (HMA) and seventeen low math-anxious (LMA) individuals were presented with a numerical and a classical Stroop task. Groups did not differ in terms of trait or state anxiety. We found enhanced error-related negativity (ERN) in the HMA group when subjects committed an error on the numerical Stroop task, but not on the classical Stroop task. Groups did not differ in terms of the correct-related negativity component (CRN), the error positivity component (Pe), classical behavioral measures or post-error measures. The amplitude of the ERN was negatively related to participants' math anxiety scores, showing a more negative amplitude as the score increased. Moreover, using standardized low-resolution electromagnetic tomography (sLORETA) we found greater activation of the insula in errors on a numerical task as compared to errors in a nonnumerical task only for the HMA group. The results were interpreted according to the motivational significance theory of the ERN.


In this article we examine the convenience of dollarization for Ecuador today. As Ecuador is strongly integrated financially and commercially with the United States, the exchange rate pass-through should be zero. However, we sustain that rising rates of imports from trade partners other than the United States and subsequent real effective exchange rate depreciations are causing the pass-through to move away from zero. Here, in the framework of the Vector Error Correction Model, we analyse the impulse response function and variance decomposition of the inflation variable. We show that the developing economy of Ecuador is importing inflation from its main trading partners, most of them emerging countries with appreciated currencies. We argue that if Ecuador recovered both its monetary and exchange rate instruments it would be able to fight against inflation. We believe such an analysis could be extended to other countries with pegged exchange rate regimes.
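The error-correction mechanism at the heart of this analysis can be illustrated with a single-equation sketch (Python/NumPy; the data-generating process and the 0.2 adjustment speed are invented for illustration, and the paper's actual model is a multivariate VECM):

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical illustration: the foreign price level p_f follows a random
# walk, and the domestic price level p adjusts toward it (cointegration),
# mimicking imported inflation under a hard peg such as dollarization.
n = 500
p_f = np.cumsum(0.1 * rng.standard_normal(n))
p = np.empty(n)
p[0] = 0.0
for t in range(1, n):
    # Error correction: close 20% of the gap each period, plus noise.
    p[t] = p[t - 1] + 0.2 * (p_f[t - 1] - p[t - 1]) + 0.05 * rng.standard_normal()

# Single-equation ECM: dp_t = alpha * (p - p_f)_{t-1} + beta * dp_f_{t-1} + eps
dp = np.diff(p)
dpf = np.diff(p_f)
ect = (p - p_f)[:-1]                      # lagged error-correction term
X = np.column_stack([ect[1:], dpf[:-1]])
y = dp[1:]
alpha, beta = np.linalg.lstsq(X, y, rcond=None)[0]
```

A significantly negative `alpha` is the signature of cointegration: deviations of domestic prices from the foreign price level are corrected over time, which is what forces the pass-through away from zero when import partners change.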


QUESTIONS UNDER STUDY: Since tumour burden consumes substantial healthcare resources, precise cancer incidence estimations are pivotal to define future needs of national healthcare. This study aimed to estimate incidence and mortality rates of oesophageal, gastric, pancreatic, hepatic and colorectal cancers up to 2030 in Switzerland. METHODS: Swiss Statistics provides national incidences and mortality rates of various cancers, and models of future developments of the Swiss population. Cancer incidences and mortality rates from 1985 to 2009 were analysed to estimate trends and to predict incidence and mortality rates up to 2029. Linear regressions and Joinpoint analyses were performed to estimate the future trends of incidences and mortality rates. RESULTS: Crude incidences of oesophageal, pancreas, liver and colorectal cancers have steadily increased since 1985, and will continue to increase. Gastric cancer incidence and mortality rates reveal an ongoing decrease. Pancreatic and liver cancer crude mortality rates will keep increasing, whereas colorectal cancer mortality on the contrary will fall. Mortality from oesophageal cancer will plateau or minimally increase. If we consider European population-standardised incidence rates, oesophageal, pancreatic and colorectal cancer incidences are steady. Gastric cancers are diminishing and liver cancers will follow an increasing trend. Standardised mortality rates show a diminution for all but liver cancer. CONCLUSIONS: The oncological burden of gastrointestinal cancer will significantly increase in Switzerland during the next two decades. The crude mortality rates globally show an ongoing increase except for gastric and colorectal cancers. Enlarged healthcare resources to take care of these complex patient groups properly will be needed.
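The projection step can be sketched as follows (Python/NumPy; the incidence series is synthetic, and the authors additionally used Joinpoint analyses, which allow trend breaks that a single linear fit does not):

```python
import numpy as np

# Hypothetical illustration: fit a linear trend to observed crude incidence
# (cases per 100,000) for 1985-2009 and extrapolate to 2029.
years = np.arange(1985, 2010)
incidence = 20.0 + 0.35 * (years - 1985)   # synthetic, steadily rising series

slope, intercept = np.polyfit(years, incidence, 1)
proj_2029 = slope * 2029 + intercept       # -> 35.4 for this synthetic series
```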


Adjusting behavior following the detection of inappropriate actions allows flexible adaptation to task demands and environmental contingencies during goal-directed behaviors. Post-error behavioral adjustments typically consist in adopting a more cautious response mode, which manifests as a slowing down of response speed. Although converging evidence implicates the dorsolateral prefrontal cortex (DLPFC) in post-error behavioral adjustment, whether and when the left or right DLPFC is critical for post-error slowing (PES), as well as the underlying brain mechanisms, remain highly debated. To resolve these issues, we used single-pulse transcranial magnetic stimulation (TMS) in healthy human adults to disrupt the left or right DLPFC selectively at various delays within the 30-180 ms interval following false-alarm (FA) commission, while participants performed a standard visual Go/NoGo task. PES significantly increased after TMS disruption of the right, but not the left, DLPFC at 150 ms post-FA response. We discuss these results in terms of an involvement of the right DLPFC in reducing the detrimental effects of error detection on subsequent behavioral performance, as opposed to implementing an adaptive error-induced slowing down of response speed.
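Post-error slowing is conventionally quantified as the mean reaction time on trials following an error minus the mean reaction time on trials following a correct response. A minimal sketch (Python/NumPy; the reaction-time data are hypothetical):

```python
import numpy as np

# Hypothetical trial sequence: reaction times in ms, and per-trial accuracy.
rt = np.array([420, 380, 510, 430, 390, 505, 440, 425], dtype=float)
correct = np.array([1, 0, 1, 1, 0, 1, 1, 1], dtype=bool)  # False = error

post_error = rt[1:][~correct[:-1]]     # trials following an error
post_correct = rt[1:][correct[:-1]]    # trials following a correct response
pes = post_error.mean() - post_correct.mean()   # positive -> slowing
```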


Fourier analyses allow the tooth contour to be characterized and yield a set of parameters for subsequent multivariate analysis. However, the great complexity of some shapes makes it necessary to determine the intrinsic measurement error involved. The aim of this work is to apply and validate Fourier analysis in the study of the dental shape of the lower second molar (M2) of four species of Hominoidea primates, in order to explore interspecific morphometric variability and to determine measurement error at the intra- and interobserver level. The contour of the occlusal surface of the tooth was defined digitally, and with the functions derived from the Fourier analysis, discriminant analyses and Mantel tests (Pearson correlations) were performed to determine shape differences from the measurements taken. The results indicate that Fourier analysis captures the shape variability of molar teeth in Hominoidea primate species. Additionally, the high correlation levels at both the intraobserver (r>0.9) and interobserver (r>0.7) levels suggest that morphometric descriptions of the tooth obtained by different observers with Fourier methods can be pooled for subsequent analyses.
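A minimal sketch of contour description by Fourier decomposition (Python/NumPy; a radial Fourier expansion on a synthetic outline is used here for illustration, while the study's actual descriptors and multivariate tests are more elaborate):

```python
import numpy as np

# Sample a closed outline as radius vs. angle; the harmonic amplitudes then
# serve as shape variables for later multivariate analysis.
n = 256
theta = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)

# Synthetic "molar outline": a circle with a 4-lobed perturbation (4 cusps).
radius = 1.0 + 0.15 * np.cos(4 * theta)

coeffs = np.fft.rfft(radius) / n
amplitudes = 2.0 * np.abs(coeffs[1:])   # harmonic amplitudes (shape descriptors)

# The 4th harmonic dominates, reflecting the four-cusped form.
dominant = int(np.argmax(amplitudes)) + 1
```

Comparing such amplitude vectors digitized by different observers (e.g. via Pearson correlations) is one way to quantify the intra- and interobserver measurement error discussed above.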


Nanoantennae show potential for photosynthesis research for two reasons: first, by spatially confining light for experiments which require high spatial resolution, and second, by enhancing the photon emission of single light-harvesting complexes. For effective use of nanoantennae, a detailed understanding of the interaction between the nanoantenna and the light-harvesting complex is required. Here we report how the excitation and emission of multiple purple bacterial LH2s (light-harvesting complex 2) are controlled by single gold nanorod antennae. LH2 complexes were chemically attached to such antennae, and the antenna length was systematically varied to tune the resonance with respect to the LH2 absorption and emission. There are three main findings. (i) The polarization of the LH2 emission is fully controlled by the resonant nanoantenna. (ii) The largest fluorescence enhancement, of 23 times, is reached for excitation with light at λ = 850 nm, polarized along the long axis of the resonant antenna. The excitation enhancement is found to be 6 times, while the emission efficiency is increased 3.6 times. (iii) The fluorescence lifetime of LH2 depends strongly on the antenna length, with shortest lifetimes of ∼40 ps for the resonant antenna. The lifetime shortening arises from an 11-times resonant enhancement of the radiative rate, together with a 2-3 times increase of the non-radiative rate, compared to the off-resonant antenna. The observed length dependence of radiative and non-radiative rate enhancement is in good agreement with simulations. Overall this work gives a complete picture of how the excitation and emission of multi-pigment light-harvesting complexes are influenced by a dipole nanoantenna.


Given the dual role of many plant traits to tolerate both herbivore attack and abiotic stress, the climatic niche of a species should be integrated into the study of plant defense strategies. Here we investigate the impact of plant reproductive strategy and components of species' climatic niche on the rate of chemical defense evolution in the milkweeds using a common garden experiment of 49 species. We found that across Asclepias species, clonal reproduction repeatedly evolved in lower temperature conditions, in species generally producing low concentrations of a toxic defense (cardenolides). Additionally, we found that rates of cardenolide evolution were lower for clonal than for nonclonal species. We thus conclude that because the clonal strategy is based on survival, long generation times, and is associated with tolerance of herbivory, it may be an alternative to toxicity in colder ecosystems. Taken together, these results indicate that the rate of chemical defense evolution is influenced by the intersection of life-history strategy and climatic niches into which plants radiate.


Using event-related brain potentials, the time course of error detection and correction was studied in healthy human subjects. A feedforward model of error correction was used to predict the timing properties of the error and corrective movements. Analysis of the multichannel recordings focused on (1) the error-related negativity (ERN) seen immediately after errors in response- and stimulus-locked averages and (2) the lateralized readiness potential (LRP) reflecting motor preparation. Comparison of the onset and time course of the ERN and LRP components showed that the signs of corrective activity preceded the ERN. Thus, error correction was implemented before, or at least in parallel with, the appearance of the ERN component. Also, the amplitude of the ERN component was increased for errors followed by fast corrective movements. The results are compatible with recent views considering the ERN component as the output of an evaluative system engaged in monitoring motor conflict.