897 results for Simulation-based methods
Abstract:
In recent years, vibration-based structural damage identification has been the subject of significant research in structural engineering. The basic idea of vibration-based methods is that damage induces changes in mechanical properties that cause anomalies in the dynamic response of the structure; measuring these anomalies allows the damage and its extent to be localized. Measured vibration data, such as frequencies and mode shapes, can be used in Finite Element Model Updating to adjust structural parameters that are sensitive to damage (e.g. Young’s modulus). The novel aspect of this thesis is the introduction into the objective function of accurate measurements of strain mode shapes, evaluated through FBG sensors. After a review of the relevant literature, the case study, an irregular prestressed concrete beam intended for the roofing of industrial structures, is presented. The mathematical model was built through FE models, studying the static and dynamic behaviour of the element. A further analytical model, based on the Ritz method, was developed to investigate the possible interaction between the RC beam and the steel supporting table used for testing. Experimental data, recorded through the simultaneous use of different measurement techniques (optical fibres, accelerometers, LVDTs), were compared with theoretical data, allowing the best model to be identified; for this model, the settings for the updating procedure are outlined.
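A minimal sketch of the kind of objective function involved (an illustrative assumption; the abstract does not give the exact residual terms or weights used in the thesis), combining relative frequency residuals with strain mode shape residuals measured by the FBG sensors, as a function of the updating parameters \theta (e.g. Young’s modulus):

J(\theta) = \sum_i w_i \left( \frac{f_i^{exp} - f_i(\theta)}{f_i^{exp}} \right)^2 + \sum_j v_j \left\| \varepsilon_j^{exp} - \varepsilon_j(\theta) \right\|^2

Minimizing J(\theta) adjusts the damage-sensitive parameters until the FE model reproduces the measured dynamics; localized drops in the updated stiffness then indicate the location and extent of damage.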
Abstract:
Integrated water resource management requires distinguishing the water pathways that are accessible to societies from those that are not. Water follows many pathways, which vary strongly from one place to another. The question can be simplified by focusing instead on water’s two destinations. Blue water forms the stores and flows of the hydrosystem: rivers, groundwater and subsurface flows. Green water is the invisible flux of water vapour returning to the atmosphere; it includes the water consumed by plants and the water held in soils. Yet a large number of studies address only one type of blue water, generally considering only the fate of streamflow or, more rarely, groundwater recharge, so the overall picture is missing. At the same time, climate change affects these water pathways by altering the different components of the hydrological cycle in distinct ways. The study presented here uses the SWAT modelling tool to track all components of the hydrological cycle and to quantify the impact of climate change on the hydrosystem of the Garonne river basin. A first part of the work refined the model set-up to best address the research question. Particular care was taken in the use of gridded meteorological data (SAFRAN) and in accounting for snow over the mountainous areas. Calibration of the model parameters was tested in a differential split-sample context, calibrating and then validating on climatically contrasted years in order to assess the robustness of the simulation under climate change. This step yielded a substantial improvement in performance over the calibration period (2000-2010) and demonstrated the stability of the model under climate change. Simulations over a century-long period (1960-2050) were then produced and analysed in two phases: i) The past period (1960-2000), based on climate observations, served as a long-term validation period for the simulated streamflow, with very good performance. Analysis of the different hydrological components reveals a strong impact on green water fluxes and stocks, with a decrease in soil water content and a marked increase in evapotranspiration. The blue water components are mainly disturbed through the snowpack and the streamflow, both of which show a substantial decline. ii) Hydrological projections were produced (2010-2050) using a range of scenarios and climate models derived from dynamical downscaling. The analysis of these simulations largely confirms the conclusions drawn from the past period: a strong impact on green water, again with a decrease in soil water content and an increase in potential evapotranspiration. The simulations show that soil water content during the summer becomes low enough to reduce actual evapotranspiration fluxes, pointing to a possible future deficit in green water stocks. Moreover, while the analysis of the blue water components still shows a significant decrease in the snowpack, streamflow now appears to increase during autumn and winter.
These results are a sign of the “acceleration” of the surface blue water components, probably related to the increase in extreme precipitation events. This work provides an analysis of the variations of most components of the hydrological cycle at the catchment scale, confirming the importance of taking all these components into account when assessing the impact of climate change, and more broadly of environmental change, on water resources.
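As a hedged illustration of the differential split-sample test described above (a generic sketch, not the thesis’ actual SWAT set-up; the skill metric and the calibration interface are assumptions), one can calibrate on climatically dry years and validate on wet ones, and vice versa:

import numpy as np

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 is a perfect fit; values below 0 are
    worse than simply predicting the observed mean discharge."""
    obs, sim = np.asarray(obs, float), np.asarray(sim, float)
    return 1.0 - np.sum((obs - sim) ** 2) / np.sum((obs - obs.mean()) ** 2)

def differential_split_sample(years, annual_rain, model, obs):
    """Calibrate on the driest half of the years, validate on the wettest
    half, then swap. `model.calibrate` and `model.simulate` are hypothetical
    stand-ins for a SWAT calibration/run wrapper; `obs` maps each year to
    its observed discharge series."""
    order = np.argsort(annual_rain)
    half = len(years) // 2
    dry = [years[i] for i in order[:half]]
    wet = [years[i] for i in order[half:]]
    scores = {}
    for cal, val, tag in ((dry, wet, "dry->wet"), (wet, dry, "wet->dry")):
        params = model.calibrate(cal)                     # hypothetical call
        sim = np.concatenate([model.simulate(params, y) for y in val])
        ref = np.concatenate([obs[y] for y in val])
        scores[tag] = nse(ref, sim)
    return scores

A model whose skill degrades little from one climate regime to the other can be considered robust for climate-change projections, which is the property the abstract reports for the Garonne model.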
Abstract:
Objectives: Mycological contamination of occupational environments can result from the dispersion of fungal spores in the air and on surfaces; it is therefore important to assess both types of samples. In the present study we assessed fungal contamination in air and surface samples to show the relevance of surface sampling in complementing the results obtained from air samples. Material and Methods: In total, 42 settings were assessed by the analysis of air and surface samples. The settings were divided into settings with a high fungal load (7 poultry farms and 7 pig farms, 3 cork industries, 3 waste management plants, 2 wastewater treatment plants and 1 horse stable) and a low fungal load (10 hospital canteens, 8 college canteens and 1 maternity hospital). In addition to culture-based methods, molecular tools were applied to detect the fungal burden in the settings with a higher fungal load. Results: Of the 218 sampling sites, 140 (64.2%) presented species on the examined surfaces that differed from the species identified in the air. In the high fungal load settings, a positive association was found between the presence of different species in the air and on surfaces. Wastewater treatment plants were the setting with the highest number of species differing between air and surfaces. Conclusions: We observed that surface sampling and the application of molecular tools showed the same efficacy of species detection in high fungal load settings, corroborating the fact that surface sampling is crucial for a correct and complete analysis of occupational scenarios.
Abstract:
Perturbation of natural ecosystems, namely through increasing freshwater use and its degradative use, as well as topsoil erosion by water in land-use production systems, has emerged as a topic of high environmental concern. Freshwater use has become a focus of attention in recent years for all stakeholders involved in the production of goods, mainly agro-industrial and forest-based products, which are freshwater-intensive and require large inputs of green and blue water. This thesis presents a global review of the available Water Footprint Assessment and Life Cycle Assessment (LCA)-based methods for measuring and assessing the environmental relevance of freshwater resource use from a life cycle perspective. Using some of the available midpoint LCA-based methods, the freshwater use-related impacts of a Portuguese wine (white ‘vinho verde’) were assessed. However, the environmental relevance of green water has been neglected, owing to the absence of a comprehensive impact assessment method for green water flows. To overcome this constraint, this thesis improves and extends LCA-based methods by providing a midpoint, spatially explicit Life Cycle Impact Assessment (LCIA) method for assessing impacts on terrestrial green water flow and for addressing reductions in surface blue water production caused by reductions in surface runoff due to land-use production systems. The applicability of the proposed method is illustrated by a case study on Eucalyptus globulus conducted in Portugal, as the growth of short-rotation forestry is largely dependent on local precipitation. Topsoil erosion by water has been characterised as one of the most serious problems for rivers. For this reason, the thesis also focuses on the ecosystem impacts caused by suspended solids (SS) from topsoil erosion that reach freshwater systems. A framework for modelling spatially distributed SS delivery to freshwater streams was developed, together with a fate-and-effect LCIA method to derive site-specific characterisation factors (CFs) for endpoint damage to aquatic ecosystem diversity, namely to algae, macrophytes and macroinvertebrates. The applicability of this framework, combined with the derived site-specific CFs, is shown through a case study on E. globulus stands located in Portugal as an example of a land-use-based system. A spatially explicit LCA was shown to be necessary, since the impacts associated with both green water flows and SS vary greatly as a function of spatial location.
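For reference, a hedged sketch of how such midpoint characterisation factors are applied in LCIA (the generic aggregation formula; the thesis derives spatially explicit CFs, which is why the sum runs over locations):

\text{Impact} = \sum_{i} CF_i \cdot m_i

where m_i is the inventory flow in location i (e.g. the volume of green water evapotranspired, or the mass of SS delivered to the stream network) and CF_i is the site-specific characterisation factor for that location.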
Abstract:
Relationships between organisms within an ecosystem are one of the main focuses in the study of ecology and evolution. Host-parasite interactions, for instance, have long been of close interest to ecology, evolutionary biology and conservation science, owing to the great variety of strategies and interaction outcomes. Monogenean ectoparasites make up a significant portion of the flatworms. Gyrodactylus salaris is a monogenean freshwater ectoparasite of Atlantic salmon (Salmo salar) whose damage can leave fish prone to further bacterial and fungal infections. G. salaris is the only such parasite whose genome has been studied so far. The RNA-seq data analyzed in this thesis had already been annotated using LAST. The data were obtained from Illumina sequencing, and the resulting reads were assembled into 15,777 transcripts. LAST annotated 46% of the transcripts; the remainder were left unknown. This thesis work started from the whole dataset, and the annotation process was continued using PANNZER, CDD and InterProScan. This resulted in 56% of the sequences being successfully annotated, with parasite-specific proteins identified. The thesis represents the first monogenean transcriptomic resource, providing an important source for further research on this species. Additionally, the comparison of annotation methods revealed, interestingly, that description- and domain-based methods perform better than simple similarity search methods; the use of these tools and databases is therefore suggested for functional annotation. These results also emphasize the need to use multiple methods and databases, and highlight the need for more genomic information related to G. salaris.
Abstract:
This work presents the analysis of wave and turbulence measurements collected at a tidal energy site. A new method is introduced to produce more consistent and rigorous estimates of the power spectral densities of the velocity fluctuations. An analytical function is further proposed to fit the observed spectra; it could serve as input to numerical models predicting power production and structural loading on tidal turbines. Another new approach is developed to correct for the effect of Doppler noise on the high-frequency power spectral densities. The analysis of velocity time series combining wave and turbulent contributions demonstrates that the turbulent motions are coherent throughout the water column, rendering wave coherence-based separation methods inapplicable to our dataset. To avoid this problem, an alternative approach relying on the pressure data collected by the ADCP is introduced, showing appreciable improvement in the wave-turbulence separation.
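As a hedged sketch of the standard spectral workflow this improves upon (Welch averaging plus a white-noise-floor correction; the thesis’ own estimator is more rigorous and is not reproduced here), assuming a velocity series sampled at a few hertz from the ADCP:

import numpy as np
from scipy.signal import welch

fs = 8.0                          # sampling frequency in Hz (assumed value)
u = np.loadtxt("velocity.txt")    # along-beam velocity series (hypothetical file)

# Welch estimate: averaged periodograms over overlapping segments.
f, pxx = welch(u, fs=fs, nperseg=512)

# Doppler noise is approximately white, so it appears as a flat floor at
# high frequencies; estimate it there and subtract it from the spectrum.
noise_floor = np.median(pxx[f > 0.8 * f.max()])
pxx_corrected = np.clip(pxx - noise_floor, 0.0, None)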
Development of a simple and fast “DNA extraction kit” for the identification of seafood and marine species
Abstract:
Seafood fraud, the misrepresentation of seafood products, has been discovered all around the world in different forms, such as false labelling, species substitution, short-weighting or over-glazing, in order to hide the correct identity, origin or weight of the product. Given the value of seafood products such as canned tuna, swordfish or grouper, the main commercial fraud affecting these species is the replacement of valuable species with other species of little or no value. A similar situation occurs with shelled shrimp or shellfish that are reduced to pieces for commercialization. Food fraud by species substitution is an emerging risk given the increasingly global food supply chain and the potential food safety issues. Economic food fraud is committed when food is deliberately placed on the market for financial gain, deceiving consumers (Woolfe & Primrose, 2004). As a result of increased demand and the globalization of the seafood supply, more fish species are encountered in the market. In this scenario, it becomes essential to identify species unequivocally. Traditional taxonomy, based primarily on species identification keys, has shown a number of limitations in the use of distinctive features in many animal taxa, which are amplified when fish, crustaceans or shellfish are commercially processed. Many fish species have a similar texture, so the certification of fish products is particularly important when the fish have undergone procedures that affect the overall anatomical structure, such as heading, slicing or filleting (Marko et al., 2004). The absence of morphological traits, the main characteristic usually used to identify animal species, represents a challenge, and molecular identification methods are required. Among these, DNA-based methods are the most frequently employed for food authentication (Lockley & Bardsley, 2000). In addition to food authentication and traceability, studies of taxonomy, population and conservation genetics, as well as analyses of dietary habits and prey selection, also rely on genetic analyses, including DNA barcoding technology (Arroyave & Stiassny, 2014; Galimberti et al., 2013; Mafra, Ferreira, & Oliveira, 2008; Nicolé et al., 2012; Rasmussen & Morrissey, 2008), which consists of PCR amplification and sequencing of a specific region of the mitochondrial COI gene. The system proposed by Hebert et al. (2003) locates within the mitochondrial COI gene (cytochrome oxidase subunit I) a bio-identification marker useful for the taxonomic identification of species (Lo Brutto et al., 2007). The COI region used for genetic identification, the DNA barcode, is short enough to allow its sequence (the pairs of nucleotide bases) to be decoded in a single step with current technology. Although this region represents only a tiny fraction of the mitochondrial DNA content of each cell, it has sufficient variability to distinguish the majority of species (Biondo et al., 2016). This technique has already been employed to address the demand for assessing the actual identity and/or provenance of marketed products, as well as to unmask mislabelling and fraudulent substitutions, which are difficult to detect especially in processed seafood (Barbuto et al., 2010; Galimberti et al., 2013; Filonzi, Chiesa, Vaghi, & Nonnis Marzano, 2010).
Nowadays, research concerns the use of genetic markers not only to identify the species and/or varieties of fish, but also to identify molecular characters able to trace the origin and to provide an effective control tool for producers and consumers along the supply chain, in agreement with local regulations.
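A toy sketch of the barcode-matching step that follows PCR amplification and sequencing (purely illustrative; the reference sequences below are invented placeholders, and real workflows align the query against curated databases such as BOLD or GenBank rather than using ungapped identity):

def pairwise_identity(a, b):
    """Fraction of matching bases over the compared length (no alignment;
    assumes sequences are already trimmed to the same COI region)."""
    n = min(len(a), len(b))
    return sum(x == y for x, y in zip(a[:n], b[:n])) / n

# Hypothetical COI reference barcodes (toy sequences, not real data).
references = {
    "Thunnus albacares": "ACTTGGCGCATGAGCTGGAATAGT",
    "Xiphias gladius":   "ACTTGGTGCTTGAGCCGGTATAGT",
}

def identify(query, refs, threshold=0.97):
    """Assign the query to the best-matching reference species, or to None
    if no reference exceeds the identity threshold."""
    best = max(refs, key=lambda sp: pairwise_identity(query, refs[sp]))
    score = pairwise_identity(query, refs[best])
    return (best, score) if score >= threshold else (None, score)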
Abstract:
The Graphical User Interface (GUI) is an integral component of contemporary computer software. A stable and reliable GUI is necessary for the correct functioning of software applications. Comprehensive verification of the GUI is a routine part of most software development life-cycles. The input space of a GUI is typically large, making exhaustive verification difficult. GUI defects are often revealed by exercising parts of the GUI that interact with each other, and it is challenging for a verification method to drive the GUI into states that might contain defects. In recent years, model-based methods that target specific GUI interactions have been developed. These methods create a formal model of the GUI’s input space from the GUI’s specification, its visible behaviors, and static analysis of its program-code. GUIs are typically dynamic in nature; their user-visible state is guided by the underlying program-code and dynamic program-state. This research extends existing model-based GUI testing techniques by modelling the interactions between the visible GUI of a GUI-based software system and its underlying program-code. The new model is able to test the GUI, efficiently and effectively, in ways that were not possible using existing methods. The thesis is this: long, useful GUI test cases can be created by examining the interactions between the GUI of a GUI-based application and its program-code. To explore this thesis, a model-based GUI testing approach is formulated and evaluated. In this approach, program-code-level interactions between GUI event handlers are examined, modelled and deployed to construct long GUI test cases. These test cases are able to drive the GUI into states that were not reachable using existing models. Implementation and evaluation were conducted using GUITAR, a fully-automated, open-source GUI testing framework.
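A minimal sketch of the idea of generating long test cases from a model of event interactions (an illustration of the general approach, not GUITAR’s actual algorithm or data structures; the event graph below is hypothetical):

from collections import deque

# Hypothetical event-interaction graph: an edge a -> b means the handler
# of event b reads or writes program state touched by the handler of a.
graph = {
    "open_dialog": ["set_option", "cancel"],
    "set_option": ["apply"],
    "apply": ["close"],
    "cancel": [],
    "close": [],
}

def generate_testcases(graph, start, max_len):
    """Breadth-first enumeration of event sequences up to max_len events;
    each sequence is replayed on the GUI as one test case."""
    cases, queue = [], deque([[start]])
    while queue:
        path = queue.popleft()
        cases.append(path)
        if len(path) < max_len:
            for nxt in graph.get(path[-1], []):
                queue.append(path + [nxt])
    return cases

for case in generate_testcases(graph, "open_dialog", 4):
    print(" -> ".join(case))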
Abstract:
Many exchange rate papers articulate the view that instabilities constitute a major impediment to exchange rate predictability. In this thesis we implement Bayesian and other techniques to account for such instabilities, and examine some of the main obstacles to exchange rate models' predictive ability. In Chapter 2 we first consider a time-varying parameter model in which fluctuations in exchange rates are related to short-term nominal interest rates ensuing from monetary policy rules, such as Taylor rules. Unlike existing exchange rate studies, the parameters of our Taylor rules are allowed to change over time, in light of the widespread evidence of shifts in fundamentals, for example in the aftermath of the Global Financial Crisis. Focusing on quarterly data from the crisis onward, we detect forecast improvements upon a random walk (RW) benchmark for at least half, and for as many as seven out of 10, of the currencies considered. Results are stronger when we allow the time-varying parameters of the Taylor rules to differ between countries. In Chapter 3 we look closely at the role of time-variation in parameters and other sources of uncertainty in hindering exchange rate models' predictive power. We apply a Bayesian setup that incorporates the notion that the relevant set of exchange rate determinants, and their corresponding coefficients, change over time. Using statistical and economic measures of performance, we first find that predictive models which allow for sudden, rather than smooth, changes in the coefficients yield significant forecast improvements and economic gains at horizons beyond one month. At shorter horizons, however, our methods fail to forecast better than the RW, and we identify uncertainty in the coefficients' estimation, and uncertainty about the precise degree of coefficient variability to incorporate in the models, as the main factors obstructing predictive ability. Chapter 4 focuses on the problem of the time-varying predictive ability of economic fundamentals for exchange rates. It uses bootstrap-based methods to uncover the time-specific conditioning information for predicting fluctuations in exchange rates. Employing several metrics for the statistical and economic evaluation of forecasting performance, we find that our approach, based on pre-selecting and validating fundamentals across bootstrap replications, generates more accurate forecasts than the RW. The approach, known as bumping, robustly reveals parsimonious models with out-of-sample predictive power at the one-month horizon, and outperforms alternative methods, including Bayesian methods, bagging, and standard forecast combinations. Chapter 5 exploits the predictive content of daily commodity prices for monthly commodity-currency exchange rates. It builds on the idea that the effect of daily commodity price fluctuations on commodity currencies is short-lived, and therefore harder to pin down at low frequencies. Using MIxed DAta Sampling (MIDAS) models, and Bayesian estimation methods to account for time-variation in predictive ability, the chapter demonstrates the usefulness of suitably exploiting such short-lived effects to improve exchange rate forecasts. It further shows that the usual low-frequency predictors, such as money supply and interest rate differentials, typically receive little support from the data at the monthly frequency, whereas MIDAS models featuring daily commodity prices receive strong support.
The chapter also introduces the random walk Metropolis-Hastings technique as a new tool for estimating MIDAS regressions.
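For reference, a minimal sketch of the random walk Metropolis-Hastings sampler in its generic form (the chapter's MIDAS-specific implementation is not reproduced here):

import numpy as np

def rw_metropolis_hastings(log_post, theta0, n_draws, step=0.1, seed=0):
    """Random walk MH: propose theta' = theta + step * N(0, I) and accept
    with probability min(1, exp(log_post(theta') - log_post(theta)))."""
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, float)
    lp = log_post(theta)
    draws = np.empty((n_draws, theta.size))
    for i in range(n_draws):
        prop = theta + step * rng.standard_normal(theta.size)
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:
            theta, lp = prop, lp_prop
        draws[i] = theta
    return draws

# Toy usage: sample from a standard bivariate normal posterior.
draws = rw_metropolis_hastings(lambda t: -0.5 * t @ t, np.zeros(2), 5000)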
Abstract:
Streptococcus pneumoniae is a human pathobiont that colonizes the nasopharynx. S. pneumoniae is responsible for non-invasive and invasive disease such as otitis, pneumonia, meningitis, and sepsis, and is a leading cause of infectious disease worldwide. Owing to similarities with closely related species sharing the same niche, it can be challenging to distinguish S. pneumoniae correctly from its relatives when using only non-culture-based methods such as real-time PCR (qPCR). In 2007, a molecular method targeting the major autolysin gene (lytA) of S. pneumoniae in a qPCR assay was proposed by Carvalho and collaborators to identify pneumococcus; since then, this method has been widely used worldwide. In 2013, the gene encoding the ABC iron transporter lipoprotein PiaA was proposed by Trzcinzki and collaborators for use in parallel with the lytA qPCR assay. However, lytA gene homologues have been described in closely related species such as S. pseudopneumoniae and S. mitis, and the piaA gene is not ubiquitous among S. pneumoniae strains. The hyaluronate lyase gene (hylA) has been described as ubiquitous in S. pneumoniae, but has not so far been used as a target for the identification of the species. The aims of our study were: to evaluate the specificity, sensitivity, positive predictive value (PPV) and negative predictive value (NPV) of the lytA and piaA qPCR methods; to design and implement a new assay targeting the hylA gene and evaluate the same parameters; and to analyze the assays independently and in their possible combinations in order to assess the best qPCR approach for identifying S. pneumoniae. A total of 278 previously characterized strains were tested: 61 S. pseudopneumoniae, 37 Viridans group strains, 30 type strains from other streptococcal species and 150 S. pneumoniae strains. The collection included both carriage and disease isolates. By Multilocus Sequence Analysis (MLSA) we confirmed that strains of S. pseudopneumoniae can be misidentified as S. pneumoniae when the lytA qPCR assay is used. The results showed that, as a single target, lytA had the best combination of specificity, sensitivity, PPV and NPV: 98.5%, 100.0%, 98.7% and 100.0%, respectively. The combination of targets with the best values of specificity, sensitivity, PPV and NPV was lytA plus piaA, with 100.0%, 93.3%, 97.9% and 92.6%, respectively; nonetheless, some capsulated (23F, 6B and 11A) and non-capsulated S. pneumoniae strains were not identified using this combination. The hylA gene as a single target had the lowest PPV, but it was capable of correctly identifying all S. pneumoniae strains.
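For reference, the four evaluation measures used in the study follow directly from the true/false positive and negative counts; a minimal sketch using the standard definitions (not code from the study):

def diagnostic_metrics(tp, fp, tn, fn):
    """Standard definitions: sensitivity = TP/(TP+FN), specificity = TN/(TN+FP),
    PPV = TP/(TP+FP), NPV = TN/(TN+FN)."""
    return {
        "sensitivity": tp / (tp + fn),
        "specificity": tn / (tn + fp),
        "ppv": tp / (tp + fp),
        "npv": tn / (tn + fn),
    }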
Abstract:
Avocado germplasm is presently conserved ex situ in the form of field repositories across the globe, including in Australia. Maintaining germplasm in the field is costly, labour- and land-intensive, exposed to natural disasters, and always at risk from abiotic and biotic stresses. The aim of this study was to overcome these problems by using cryopreservation to store avocado (Persea americana Mill.) somatic embryos (SE). Two vitrification-based methods of cryopreservation (cryovial- and droplet-vitrification) were optimised using four avocado cultivars (‘A10’, ‘Reed’, ‘Velvick’ and ‘Duke-7’). SE of the four cultivars were stored short-term (one hour) in liquid nitrogen (LN) using the cryovial-vitrification method, showing viabilities of 91%, 73%, 86% and 80%, respectively, while with the droplet-vitrification method viabilities of 100%, 85% and 93% were recorded for ‘A10’, ‘Reed’ and ‘Velvick’. For long-term storage, SE of cultivars ‘A10’, ‘Reed’ and ‘Velvick’ were successfully recovered with viabilities of 65–100% after 3 months of LN storage. For cultivars ‘Reed’ and ‘Velvick’, SE were recovered after 12 months of LN storage with viabilities of 67% and 59%, respectively. The outcome of this work contributes towards the establishment of a cryopreservation protocol applicable across multiple avocado cultivars.
Abstract:
Background: Despite decreasing incidence, Helicobacter pylori (H. pylori) remains one of the most common bacterial infectious diseases in humans. Infection with H. pylori is a risk factor for diseases such as gastroduodenal ulcers, gastric carcinoma and MALT (mucosa-associated lymphoid tissue) lymphoma. Various invasive and non-invasive procedures are available for the diagnosis of H. pylori. The 13C-urea breath test is recommended for monitoring the success of eradication therapy, but is currently not used as a standard method for the primary diagnosis of H. pylori in Germany. Research question: What is the medical and health-economic benefit of testing for H. pylori colonization with the 13C-urea breath test in primary diagnosis, compared with invasive and non-invasive diagnostic procedures? Methods: Based on a systematic literature search combined with a hand search, studies on the diagnostic accuracy and cost-effectiveness of the 13C-urea breath test compared with other diagnostic procedures for the primary detection of H. pylori are identified. Only medical studies that compare the 13C-urea breath test directly with other H. pylori test procedures are included. The gold standard is one, or a combination, of the biopsy-based test procedures. For the health-economic assessment, only full health-economic evaluation studies in which the cost-effectiveness of the 13C-urea breath test is compared directly with other H. pylori test procedures are included. Results: Thirty medical studies are included in the present report. Compared with the immunoglobulin G (IgG) test, the sensitivity of the 13C-urea breath test is higher in twelve comparisons, lower in six and equal in one, and its specificity is higher in 13, lower in three and equal in two. Compared with the stool antigen test, the sensitivity of the 13C-urea breath test is higher in nine comparisons, lower in three and equal in one, and its specificity is higher in nine, lower in two and equal in two. Compared with the rapid urease test, the sensitivity of the 13C-urea breath test is higher in four comparisons, lower in three and equal in four, and its specificity is higher in five, lower in five and equal in one. Compared with histology, the sensitivity of the 13C-urea breath test is higher in one comparison and lower in two, and its specificity is higher in two and lower in one. One comparison each shows no difference between the 13C-urea breath test and the 14C-urea breath test, and a lower sensitivity and higher specificity compared with polymerase chain reaction (PCR). Whether the reported differences are statistically significant is stated in six of the 30 studies. Nine health-economic evaluations are considered in the present report. The test-and-treat strategy based on the 13C-urea breath test is compared with a serology-based test-and-treat approach in six studies and with a stool-antigen-based test-and-treat approach in three studies. The breath test approach is cost-effective compared with the serological method in three studies and is dominated by the stool antigen test strategy in one.
Four studies compare the test-and-treat strategy based on the 13C-urea breath test with empirical antisecretory therapy, with the breath test approach proving cost-effective in two of them, and two studies compare it with empirical eradication therapy. In five studies, the test-and-treat approach based on the 13C-urea breath test is compared with an endoscopy-based strategy; the breath test strategy dominates the endoscopic procedure in two and is dominated by it in one. Discussion: Both the medical and the economic studies show more or less serious shortcomings and yield heterogeneous results. The majority of the medical studies do not report the statistical significance of the observed differences between the respective test procedures. In direct comparison, the 13C-urea breath test mostly shows higher diagnostic accuracy than the IgG test and the stool antigen test. No trends regarding sensitivity can be derived from the comparisons with the rapid urease test, whereas the specificity of the 13C-urea breath test appears to be higher. Too few results are available for the comparisons of the 13C-urea breath test with histology, the 14C-urea breath test and PCR. In the included economic literature, some study results point towards cost-effectiveness of the test-and-treat strategy based on the 13C-urea breath test compared with serology-based test-and-treat and with empirical antisecretory therapy. Valid results and economic evidence are lacking for deriving trends regarding the cost-effectiveness of the breath test strategy compared with the stool antigen test-and-treat strategy and with empirical eradication therapy, and the findings regarding comparisons with endoscopy-based procedures are too heterogeneous. Overall, none of the economic models can fully capture the complexity of managing patients with dyspeptic complaints. Conclusions/recommendations: In summary, the available evidence for the medical and economic assessment of the 13C-urea breath test compared with other diagnostic methods is insufficient to recommend the breath test, instead of an endoscopy-based method, as a standard primary diagnostic procedure within a test-and-treat strategy for the management of patients with dyspeptic complaints in the German healthcare setting, particularly against the background of the guidelines of the German Society for Digestive and Metabolic Diseases (DGVS).
Abstract:
Forty-four species of Colletotrichum are confirmed as present in Australia based on DNA sequencing analyses. Many of these species were identified directly as a result of two workshops organised by the Subcommittee on Plant Health Diagnostics in Australia in 2015 that covered morphological and molecular approaches to the identification of Colletotrichum. Several other species of Colletotrichum reported from Australia remain to be substantiated by DNA sequence-based methods. This body of work aims to provide a basis from which to critically examine a number of isolates of Colletotrichum deposited in Australian culture collections.
Abstract:
This article proposes a new control strategy based on continuous glucose measurements and a habituating sliding mode controller (HSMC). The HSMC is developed by combining a sliding mode control law with habituating control principles. Applied to blood glucose regulation in the intensive care unit, the HSMC manipulates both intravascular glucose and insulin infusion inputs, in order to provide nutritional support and improve disturbance rejection. A simulation-based (in silico) study using a physiological model of glucose-insulin dynamics shows that the proposed control strategy performs appropriately. Finally, the performance of the proposed controller is compared with that of a standard PID controller.
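A hedged sketch of the sliding mode component of such a controller (textbook first-order SMC with a smoothed switching term to limit chattering; the habituating extension and the physiological glucose-insulin model used in the article are not reproduced here, and all gains below are illustrative):

import numpy as np

def smc_insulin_rate(g, g_dot, g_ref, lam=0.05, k=2.0, phi=5.0, u_max=10.0):
    """Sliding surface s = de/dt + lam * e, with e = g - g_ref (mg/dL).
    The control drives s to zero; tanh(s/phi) replaces sign(s) to soften
    chattering. Output is an insulin infusion rate clipped to [0, u_max]."""
    e = g - g_ref
    s = g_dot + lam * e
    u = k * np.tanh(s / phi)   # more insulin when glucose is high and rising
    return float(np.clip(u, 0.0, u_max))

# Toy usage: glucose 40 mg/dL above target and rising at 1 mg/dL per minute.
print(smc_insulin_rate(g=140.0, g_dot=1.0, g_ref=100.0))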
Abstract:
Manufacturing companies have moved from selling purely tangible products to adopting a service-oriented approach in order to generate steady and continuous revenue streams. Nowadays, equipment and machine manufacturers possess technologies to track and analyze product-related data, obtaining relevant information about customers’ use of the product after it is sold. The Internet of Things in industrial environments will allow manufacturers to leverage lifecycle product traceability to innovate towards an information-driven services approach, commonly referred to as “Smart Services”, achieving improvements in support, maintenance and usage processes. The aim of this study is to conduct a literature review and an empirical analysis in order to present a framework that describes a customer-oriented approach for developing information-driven services leveraged by the Internet of Things in manufacturing companies. The empirical study employed customer-needs-assessment tools to analyze the case company in terms of information requirements and digital needs. The literature review supported the empirical analysis with in-depth research on product lifecycle traceability and the digitalization of product-related services within manufacturing value chains, as well as on the role of simulation-based technologies in supporting the “Smart Service” development process. The results of the case company analysis show that customers mainly demand information that allows them to monitor machine condition, machine behavior under different geographical conditions, machine-implement interactions, and resource and energy consumption. Put simply, they demand information outputs that allow them to increase machine productivity, maximizing yields, saving time and optimizing resources in the most sustainable way. Based on the customer needs assessment, this study presents a framework describing the initial phases of a “Smart Service” development process, considering the requirements of Smart Engineering methodologies.