896 results for "Quality system"
Abstract:
OBJECTIVE: To compare the cost and effectiveness of the levonorgestrel-releasing intrauterine system (LNG-IUS) versus combined oral contraception (COC) and progestogens (PROG) in the first-line treatment of dysfunctional uterine bleeding (DUB) in Spain. STUDY DESIGN: A cost-effectiveness and cost-utility analysis of LNG-IUS, COC and PROG was carried out using a Markov model based on clinical data from the literature and expert opinion. The population studied consisted of women with a previous diagnosis of idiopathic heavy menstrual bleeding. The analysis was performed from the National Health System perspective, discounting both costs and future effects at 3%. In addition, a sensitivity analysis (univariate and probabilistic) was conducted. RESULTS: The results show that the greater efficacy of LNG-IUS translates into a gain of 1.92 and 3.89 symptom-free months (SFM) after six months of treatment versus COC and PROG, respectively (an increase of 33% and 60% in symptom-free time). Regarding costs, LNG-IUS produces savings of 174.2-309.95 and 230.54-577.61 versus COC and PROG, respectively, over 6 months to 5 years. Apart from cost savings and gains in SFM, quality-adjusted life months (QALM) are also favourable to LNG-IUS in all scenarios, with gains ranging between 1 and 2 QALM compared to COC and PROG. CONCLUSIONS: The results indicate that first-line use of the LNG-IUS is the dominant therapeutic option (less costly and more effective) in comparison with first-line use of COC or PROG for the treatment of DUB in Spain. LNG-IUS as first line is also the option that provides the greatest health-related quality of life to patients.
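A Markov model of the kind described can be sketched in a few lines of Python. The two-state structure (symptomatic vs. symptom-free, cycled monthly) and all transition probabilities below are illustrative placeholders, not the study's calibrated values:

```python
# Minimal two-state Markov cohort model (symptomatic vs. symptom-free),
# cycled monthly. All transition probabilities are illustrative
# placeholders, not the calibrated values of the study.

def symptom_free_months(p_improve, p_relapse, cycles=6):
    """Expected symptom-free months per patient over `cycles` months."""
    symptomatic, free = 1.0, 0.0      # whole cohort starts symptomatic
    total_free = 0.0
    for _ in range(cycles):
        symptomatic, free = (
            symptomatic * (1 - p_improve) + free * p_relapse,
            symptomatic * p_improve + free * (1 - p_relapse),
        )
        total_free += free            # months accrued in the symptom-free state
    return total_free

# Hypothetical monthly probabilities for two first-line options:
sfm_lng = symptom_free_months(p_improve=0.60, p_relapse=0.05)  # LNG-IUS-like
sfm_coc = symptom_free_months(p_improve=0.35, p_relapse=0.10)  # COC-like
gain = sfm_lng - sfm_coc   # incremental symptom-free months
```

In a full analysis each state would also carry a cost and a utility weight per cycle, yielding the cost and QALM totals that the incremental comparison is based on.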
Abstract:
This study proposes a new quantitative approach for assessing the quality of open-access university institutional repositories, tested on the Spanish university repositories. The assessment method is based on a binary codification of a proposed set of features that objectively describe the repositories. The method serves two purposes: assessing quality and enabling a near-automatic update of the feature data. First, a database of the 38 Spanish institutional repositories was created. The variables of analysis are presented and explained, whether drawn from the literature or newly introduced. The characteristics analysed include the features of the software, the services of the repository, the features of the information system, Internet visibility, and the licences of use. Results from the Spanish universities are provided as a practical example of the assessment and to give a picture of the state of development of the open-access movement in Spain.
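The binary-codification idea reduces to coding each repository characteristic as 1/0 and scoring the repository by the share of features present. A minimal sketch, with an invented feature set (the actual indicators of the study differ):

```python
# Binary codification of repository features: each characteristic is coded
# 1 (present) or 0 (absent), and the quality score is the fraction present.
# The feature names below are illustrative, not the study's indicator set.

FEATURES = ["oai_pmh", "persistent_ids", "usage_stats",
            "cc_licences", "search_engine_indexed", "export_formats"]

def quality_score(repo):
    """Return (score in [0, 1], number of features present) for one repository."""
    present = sum(1 for f in FEATURES if repo.get(f, 0) == 1)
    return present / len(FEATURES), present

repo = {"oai_pmh": 1, "persistent_ids": 1, "usage_stats": 0,
        "cc_licences": 1, "search_engine_indexed": 1, "export_formats": 0}
score, n = quality_score(repo)   # 4 of 6 features present
```

Because each feature is a simple yes/no check, re-running the codification against the live repositories is what makes the near-automatic update of the data feasible.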
Abstract:
Line converters have become an attractive AC/DC power conversion solution in industrial applications. Line converters are based on controllable semiconductor switches, typically insulated gate bipolar transistors. Compared to traditional diode bridge-based power converters, line converters have many advantageous characteristics, including bidirectional power flow, controllable dc-link voltage and power factor, and sinusoidal line current. This thesis considers the control of the line converter and its application to power quality improvement. The line converter control system studied is based on virtual flux linkage orientation and the direct torque control (DTC) principle. A new DTC-based current control scheme is introduced and analyzed. The overmodulation characteristics of the DTC converter are considered and an analytical equation for the maximum modulation index is derived. The integration of active filtering features into the line converter is considered. Three different active filtering methods are implemented. A frequency-domain method, which is based on selective harmonic sequence elimination, and a time-domain method, which is effective in a wider frequency band, are used in harmonic current compensation. Also, a voltage feedback active filtering method, which mitigates harmonic sequences of the grid voltage, is implemented. The frequency-domain and the voltage feedback active filtering control systems are analyzed and controllers are designed. The designs are verified with practical measurements. The performance and the characteristics of the implemented active filtering methods are compared and the effect of the L- and the LCL-type line filter is discussed. The importance of a correct grid impedance estimate in the voltage feedback active filter control system is discussed and a new measurement-based method to obtain it is proposed. Also, a power conditioning system (PCS) application of the line converter is considered.
A new method for correcting the voltage unbalance of the PCS-fed island network is proposed and experimentally validated.
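The frequency-domain method mentioned above rests on isolating a selected harmonic and injecting its negation as a compensation reference. An offline sketch of that core step (the test signal, harmonic order, and single-period DFT extraction are assumptions for illustration; the thesis's controller runs in closed loop on converter currents, which this sketch does not capture):

```python
import cmath
import math

# Selective harmonic elimination, offline sketch: extract the targeted
# harmonic of the line current via a single DFT bin over one fundamental
# period, and negate it to form the compensation reference.

def dft_bin(x, k):
    """k-th normalized DFT coefficient of a real sequence x (one period)."""
    n = len(x)
    return sum(x[i] * cmath.exp(-2j * math.pi * k * i / n) for i in range(n)) / n

def harmonic_reference(x, k):
    """Time-domain reference that cancels the k-th harmonic of x."""
    c = dft_bin(x, k)
    n = len(x)
    return [-2 * (c * cmath.exp(2j * math.pi * k * i / n)).real for i in range(n)]

n = 256
# Illustrative line current: fundamental plus a 20% fifth harmonic.
i_line = [math.sin(2 * math.pi * t / n) + 0.2 * math.sin(2 * math.pi * 5 * t / n)
          for t in range(n)]
ref = harmonic_reference(i_line, 5)
compensated = [a + b for a, b in zip(i_line, ref)]   # fifth harmonic removed
```

Adding `ref` to the measured current leaves the fundamental untouched while the selected harmonic sequence vanishes, which is exactly the selectivity that distinguishes this method from the wider-band time-domain approach.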
Abstract:
STUDY OBJECTIVES: Traditionally, sleep studies in mammals are performed using electroencephalogram/electromyogram (EEG/EMG) recordings to determine sleep-wake state. In laboratory animals, this requires surgery and recovery time and causes discomfort to the animal. In this study, we evaluated the performance of an alternative, noninvasive approach utilizing piezoelectric films to determine sleep and wakefulness in mice, validated against simultaneous EEG/EMG recordings. The piezoelectric films detect the animal's movements with high sensitivity, and the regularity of the piezo output signal, related to the regular breathing movements characteristic of sleep, serves to automatically determine sleep. Although the system is commercially available (Signal Solutions LLC, Lexington, KY), this is the first statistical validation of its determination of various aspects of sleep. DESIGN: EEG/EMG and piezo signals were recorded simultaneously for 48 h. SETTING: Mouse sleep laboratory. PARTICIPANTS: Nine male and nine female CFW outbred mice. INTERVENTIONS: EEG/EMG surgery. MEASUREMENTS AND RESULTS: The results showed a high correspondence between EEG/EMG-determined and piezo-determined total sleep time and the distribution of sleep over a 48-h baseline recording with 18 mice. Moreover, the piezo system was capable of assessing sleep quality (i.e., sleep consolidation), and interesting observations at transitions to and from rapid eye movement sleep were made that could be exploited in the future to also distinguish the two sleep states. CONCLUSIONS: The piezo system proved to be a reliable alternative to electroencephalogram/electromyogram recording in the mouse and will be useful for first-pass, large-scale sleep screens for genetic or pharmacological studies. CITATION: Mang GM, Nicod J, Emmenegger Y, Donohue KD, O'Hara BF, Franken P. Evaluation of a piezoelectric system as an alternative to electroencephalogram/electromyogram recordings in mouse sleep studies.
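The regularity principle behind the piezo classification can be illustrated with a toy signal-regularity test: sleep-like breathing produces a nearly periodic trace, wake-like activity an irregular one, so a windowed autocorrelation at the expected breathing lag separates the two. The sampling rate, breathing rate, and threshold below are assumptions for the sketch, not the parameters of the commercial system:

```python
import math
import random

def autocorr(x, lag):
    """Sample autocorrelation of sequence x at the given lag."""
    n = len(x) - lag
    mean = sum(x) / len(x)
    num = sum((x[i] - mean) * (x[i + lag] - mean) for i in range(n))
    den = sum((v - mean) ** 2 for v in x)
    return num / den if den else 0.0

def classify(window, lag, threshold=0.5):
    """High autocorrelation at the breathing lag -> regular signal -> sleep."""
    return "sleep" if autocorr(window, lag) > threshold else "wake"

fs, breath_hz = 100, 2.5                  # Hz; assumed mouse breathing rate
lag = int(fs / breath_hz)                 # samples per breath cycle
t = [i / fs for i in range(400)]          # 4 s window

sleep_like = [math.sin(2 * math.pi * breath_hz * ti) for ti in t]  # periodic
rng = random.Random(0)
wake_like = [rng.gauss(0, 1) for _ in t]                            # irregular
```

Scoring successive windows this way yields the sleep/wake hypnogram that was compared against the EEG/EMG-scored one.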
Abstract:
Due to the power of genetics, the mouse has become a widely used animal model in vision research. However, its eyeball has an axial length of only about 2 mm. The present protocol describes how to easily dissect the small rodent eye post mortem. This allows collecting different tissues of the eye, i.e., cornea, lens, iris, retina, optic nerve, retinal pigment epithelium (RPE), and sclera. We further describe in detail how to process these eye samples in order to obtain high‐quality RNA for RNA expression profiling studies. Depending on the eye tissue to be analyzed, we present appropriate lysis buffers to prepare total protein lysates for immunoblot and immuno‐precipitation analyses. Fixation, inclusion, embedding, and cryosectioning of the globe for routine histological analyses (HE staining, DAPI staining, immunohistochemistry, in situ hybridization) is further presented. These basic protocols should allow novice investigators to obtain eye tissue samples rapidly for their experiments.
Abstract:
The objective of this work was to evaluate the effect of grazing intensity on the decomposition of cover crop pasture, dung, and soybean residues, as well as the C and N release rates from these residues in a long-term integrated soybean-beef cattle system under no-tillage. The experiment was initiated in 2001, with soybean cultivated in summer and black oat + Italian ryegrass in winter. The treatments consisted of four sward heights (10, 20, 30, and 40 cm), plus an ungrazed area, as the control. In 2009-2011, residues from pasture, dung, and soybean stems and leaves were placed in nylon-mesh litter bags and allowed to decompose for up to 258 days. With increasing grazing intensity, residual dry matter of the pasture decreased and that of dung increased. Pasture and dung lignin concentrations and C release rates were lower with moderate grazing intensity. C and N release rates from soybean residues are not affected by grazing intensity. The moderate grazing intensity produces higher quality residues, both for pasture and dung. Total C and N release is influenced by the greater residual dry matter produced when pastures were either lightly grazed or ungrazed.
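Litter-bag decomposition data of this kind are commonly summarized with a single-pool exponential decay model, M(t) = M0·exp(-kt), whose rate constant k follows in closed form from two mass measurements. The masses and time below are invented for illustration, not the paper's data:

```python
import math

# Single-pool exponential decay, M(t) = M0 * exp(-k t), as commonly fitted
# to litter-bag residue data. Values below are illustrative, not measured.

def decay_constant(m0, mt, t):
    """Rate constant k (per day) from initial mass m0 and remaining mass mt at day t."""
    return math.log(m0 / mt) / t

def remaining(m0, k, t):
    """Mass remaining at day t under exponential decay."""
    return m0 * math.exp(-k * t)

k = decay_constant(m0=100.0, mt=40.0, t=258)   # e.g. 60% decomposed over the trial
half_life = math.log(2) / k                    # days for half the residue to decay
```

Comparing the fitted k (or half-life) across grazing intensities is one way the residue-quality differences reported above are quantified.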
Abstract:
The objective of this work was to evaluate four cultivars of saccharine sorghum (Sorghum bicolor) regarding productivity, chemical composition of plant parts, and quality of the ensiling process. The tested varieties of saccharine sorghum were BRS 506, BRS 508, BRS 509, and BRS 511. The experiment was divided into two trials, which assessed: production, morphological composition, and nutritional quality of the saccharine varieties; and fermentation quality and nutritional value of the silage produced from the saccharine varieties. Of the tested varieties, BRS 509 and BRS 511 showed the highest total dry matter (DM) production. The BRS 508 variety presented the highest in vitro digestibility of the whole plant (70.65% DM). During ensiling, the BRS 509 variety showed the lowest DM loss (8.87%). The highest effluent production was observed for BRS 506 and BRS 508, with yields of 521.87 and 393.16 kg Mg-1 ensiled DM, respectively. The BRS 511 variety is the most recommended because of the best results for plant production and nutritional quality. Regarding the ensiling process, BRS 509 presents the lowest fermentation losses and the highest nutritional value of silage.
Abstract:
The application of the three-voltage-level 20/1/0.4 kV distribution system in Finland has proved to be an economic solution for enhancing the reliability of electricity distribution. By using the 1 kV voltage level between the medium- and low-voltage networks, an improvement in reliability can be achieved, especially in aerial line networks. Considerable savings in investment and outage costs can also be achieved compared with the traditional distribution system. This master's thesis focuses on describing the situation in Russian distribution networks and subsequently analyses the possibility of applying a 1000 V distribution system in Russia. The goal is to investigate, on the basis of the Finnish experience, whether there are any possible installation targets in Russia for the new system. Compatibility with Russian safety and quality standards is also studied in this thesis.
Abstract:
Clinical practice guidelines have become an important source of information to support clinicians in the management of individual patients. However, current guideline methods have limitations, including the failure to separate the quality of evidence from the strength of recommendations. The Grading of Recommendations, Assessment, Development and Evaluation (GRADE) working group, an international collaboration of guideline developers, methodologists, and clinicians, has developed a system that addresses these shortcomings. Core elements include a transparent methodology for grading the quality of evidence, the distinction between the quality of the evidence and the strength of a recommendation, an explicit balancing of the benefits and harms of health care interventions, and an explicit recognition of the values and preferences that underlie recommendations. The GRADE system has been piloted in various practice settings to ensure that it captures the complexity involved in evidence assessment and grading recommendations while maintaining simplicity and practicality. Many guideline organizations and medical societies have endorsed the system and adopted it for their guideline processes.
Abstract:
An efficient high-resolution three-dimensional (3-D) seismic reflection system for small-scale targets in lacustrine settings was developed. In Lake Geneva, near the city of Lausanne, Switzerland, past high-resolution two-dimensional (2-D) investigations revealed a complex fault zone (the Paudèze thrust zone), which was subsequently chosen for testing our system. Observed structures include a thin (<40 m) layer of subhorizontal Quaternary sediments that unconformably overlie southeast-dipping Tertiary Molasse beds and the Paudèze thrust zone, which separates Plateau and Subalpine Molasse units. Two complete 3-D surveys have been conducted over this same test site, covering an area of about 1 km2. In 1999, a pilot survey (Survey I), comprising 80 profiles, was carried out in 8 days with a single-streamer configuration. In 2001, a second survey (Survey II) used a newly developed three-streamer system with optimized design parameters, which provided an exceptionally high-quality data set of 180 common midpoint (CMP) lines in 9 days. The main improvements include a navigation and shot-triggering system with in-house navigation software that automatically fires the gun in combination with real-time control on navigation quality using differential GPS (dGPS) onboard and a reference base near the lake shore. Shots were triggered at 5-m intervals with a maximum non-cumulative error of 25 cm.
Whereas the single 48-channel streamer system of Survey I requires extrapolation of receiver positions from the boat position, for Survey II they could be accurately calculated (error <20 cm) with the aid of three additional dGPS antennas mounted on rafts attached to the end of each of the 24-channel streamers. Towed at a distance of 75 m behind the vessel, they allow the determination of feathering due to cross-line currents or small course variations. Furthermore, two retractable booms hold the three streamers at a distance of 7.5 m from each other, which is the same distance as the sail line interval for Survey I. With a receiver spacing of 2.5 m, the bin dimension of the 3-D data of Survey II is 1.25 m in the in-line direction and 3.75 m in the cross-line direction. The greater cross-line versus in-line spacing is justified by the known structural trend of the fault zone perpendicular to the in-line direction. The data from Survey I showed some reflection discontinuity as a result of insufficiently accurate navigation and positioning and subsequent binning errors. Observed aliasing in the 3-D migration was due to insufficient lateral sampling combined with the relatively high frequency (<2000 Hz) content of the water gun source (operated at 140 bars and 0.3 m depth). These results motivated the use of a double-chamber bubble-canceling air gun for Survey II. A 15/15 Mini G.I air gun, operated at 80 bars and 1 m depth, proved to be better adapted for imaging the complexly faulted target area, which has reflectors dipping up to 30°. Although its frequencies do not exceed 650 Hz, this air gun combines a penetration of non-aliased signal to depths of 300 m below the water bottom (versus 145 m for the water gun) with a maximum vertical resolution of 1.1 m.
While Survey I was shot in patches of alternating directions, the optimized surveying time of the new three-streamer system allowed acquisition in parallel geometry, which is preferable when using an asymmetric configuration (single source and receiver array). Otherwise, the resulting stacks are different for the opposite directions. However, the shorter streamer configuration of Survey II reduced the nominal fold from 12 to 6. A conventional 3-D processing flow was adapted to the high sampling rates and was complemented by two computer programs that format the unconventional navigation data to industry standards. Processing included trace editing, geometry assignment, bin harmonization (to compensate for uneven fold due to boat/streamer drift), spherical divergence correction, bandpass filtering, velocity analysis, 3-D DMO correction, stack and 3-D time migration. A detailed semblance velocity analysis was performed on the 12-fold data set for every second in-line and every 50th CMP, i.e. on a total of 600 spectra. According to this velocity analysis, interval velocities range from 1450-1650 m/s for the unconsolidated sediments and from 1650-3000 m/s for the consolidated sediments. The delineation of several horizons and fault surfaces reveals the potential for small-scale geologic and tectonic interpretation in three dimensions. Five major seismic facies and their detailed 3-D geometries can be distinguished in vertical and horizontal sections: lacustrine sediments (Holocene), glaciolacustrine sediments (Pleistocene), Plateau Molasse, Subalpine Molasse and its thrust fault zone. Dips of beds within the Plateau and Subalpine Molasse are ~8° and ~20°, respectively. Within the fault zone, many highly deformed structures with dips around 30° are visible. Preliminary tests with 3-D preserved-amplitude prestack depth migration demonstrate that the excellent data quality of Survey II allows the application of such sophisticated techniques even to high-resolution seismic surveys.
In general, the adaptation of the 3-D marine seismic reflection method, which to date has almost exclusively been used by the oil exploration industry, to a smaller geographical as well as financial scale has helped pave the way for applying this technique to environmental and engineering purposes.

Seismic reflection is a method of investigating the subsurface with very high resolving power. It consists of sending vibrations into the ground and recording the waves that are reflected at geological discontinuities at different depths and then travel back to the surface, where they are recorded. The signals collected in this way provide information not only on the nature and geometry of the layers present, but also allow a geological interpretation of the subsurface. For example, in the case of sedimentary rocks, seismic reflection profiles make it possible to determine their mode of deposition, their possible deformations or fractures, and hence their tectonic history. Seismic reflection is the principal method of petroleum exploration. For a long time, seismic reflection was carried out along profiles that provide a two-dimensional image of the subsurface. The images obtained in this way are only partially accurate, since they do not take into account the three-dimensional nature of geological structures. Over the past few decades, three-dimensional (3-D) seismics has brought new life to the study of the subsurface. While it is now fully mastered for imaging large geological structures both onshore and offshore, its adaptation to the lacustrine or fluvial scale has so far been the subject of only a few studies. This thesis consisted of developing a seismic acquisition system similar to that used for offshore petroleum prospecting, but adapted to lakes.

The system is therefore smaller, lighter to deploy and, above all, yields final images of much higher resolution. Whereas the petroleum industry is often limited to a resolution of the order of ten metres, the instrument developed in this work can resolve details of the order of one metre. The new system is based on the ability to record seismic reflections simultaneously on three seismic cables (streamers) of 24 channels each. To obtain 3-D data, it is essential to position the instruments on the water (the source and receivers of the seismic waves) with great precision. Software was specially developed to control navigation and trigger the shots of the seismic source, using differential GPS (dGPS) receivers on the boat and at the end of each streamer. This makes it possible to position the instruments with a precision of about 20 cm. To test our system, we chose an area of Lake Geneva, near the city of Lausanne, crossed by the "La Paudèze" fault, which separates the Plateau Molasse and Subalpine Molasse units. Two 3-D seismic surveys were carried out there over an area of about 1 km2. The seismic records were then processed to turn them into interpretable images. We applied a 3-D processing sequence specially adapted to our data, particularly as regards positioning. After processing, the data reveal several main seismic facies corresponding in particular to lacustrine sediments (Holocene), glaciolacustrine sediments (Pleistocene), the Plateau Molasse, the Subalpine Molasse of the fault zone, and the Subalpine Molasse south of this zone. The detailed 3-D geometry of the faults is visible on the vertical and horizontal seismic sections. The excellent quality of the data and the interpretation of several horizons and fault surfaces show the potential of this technique for small-scale three-dimensional investigations, opening the way to its application in the fields of the environment and civil engineering.
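Interval velocities such as those quoted for the unconsolidated and consolidated sediments are conventionally derived from semblance-picked stacking (RMS) velocities via the Dix equation. A sketch with invented picks (not the survey's actual values):

```python
import math

# Dix conversion from RMS (stacking) velocities to an interval velocity.
# The two picks below are hypothetical semblance picks, not survey data.

def dix_interval_velocity(v_rms1, t1, v_rms2, t2):
    """Interval velocity (m/s) between two-way times t1 < t2 (in seconds)."""
    num = v_rms2 ** 2 * t2 - v_rms1 ** 2 * t1
    return math.sqrt(num / (t2 - t1))

# Hypothetical picks: (two-way time 0.10 s, 1500 m/s) and (0.20 s, 1600 m/s).
v_int = dix_interval_velocity(1500.0, 0.10, 1600.0, 0.20)
```

Because the interval velocity between two picks always exceeds the shallower RMS value when velocity increases with depth, a semblance panel picked every 50th CMP is enough to build the layered velocity model used for DMO, stacking, and migration.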
Abstract:
Automated genome sequencing and annotation, as well as large-scale gene expression measurement methods, generate a massive amount of data for model organisms such as human and mouse. Searching for gene-specific or organism-specific information throughout all the different databases has become a very difficult task, and often results in fragmented and unrelated answers. A database that federates and integrates genomic and transcriptomic data can greatly improve search speed as well as the quality of the results by allowing a direct comparison of expression results obtained by different techniques. The main goal of this project, called the CleanEx database, is thus to provide access to public gene expression data via unique gene names and to represent heterogeneous expression data produced by different technologies in a way that facilitates joint analysis and cross-dataset comparisons. A consistent and up-to-date gene nomenclature is achieved by associating each single gene expression experiment with a permanent target identifier consisting of a physical description of the targeted RNA population or the hybridization reagent used.
These targets are then mapped at regular intervals to the growing and evolving catalogues of genes from model organisms, such as human and mouse. The fully automatic mapping procedure relies partly on external genome information resources such as UniGene and RefSeq. The central part of CleanEx is a weekly built gene index containing cross-references to all public expression data already incorporated into the system. In addition, the expression target database of CleanEx provides gene mapping and quality control information for various types of experimental resources, such as cDNA clones or Affymetrix probe sets. The Affymetrix mapping files are accessible as text files, for further use in external applications, and as individual entries via the web-based interfaces. The CleanEx web-based query interfaces offer access to individual entries via text string searches or quantitative expression criteria, as well as cross-dataset analysis tools and cross-chip gene comparison. These tools have proven to be very efficient in expression data comparison and even, to a certain extent, in the detection of differentially expressed splice variants. The CleanEx flat files and tools are available online at http://www.cleanex.isb-sib.ch/.
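The periodic target-to-gene mapping with quality control can be pictured as re-resolving each permanent target identifier against the current gene catalogue and flagging targets that hit zero or several genes. A minimal sketch; the identifiers and catalogue below are invented for illustration, not CleanEx data structures:

```python
# Sketch of target-to-gene mapping with a quality flag: each permanent
# target (a set of sequence accessions) is re-mapped at update time against
# the current gene catalogue. Identifiers here are invented examples.

CATALOGUE = {                 # hypothetical snapshot of the gene index
    "NM_000001": "GENE_A",
    "NM_000002": "GENE_B",
    "NM_000003": "GENE_B",    # two accessions, one gene: still unambiguous
}

def map_target(target_accessions):
    """Map a target's accession list to a gene, with a quality flag."""
    genes = {CATALOGUE[a] for a in target_accessions if a in CATALOGUE}
    if len(genes) == 1:
        return genes.pop(), "ok"
    return None, ("unmapped" if not genes else "ambiguous")

gene, flag = map_target(["NM_000001"])                 # clean one-to-one mapping
bad, flag2 = map_target(["NM_000001", "NM_000002"])    # flagged: spans two genes
```

Re-running this mapping whenever the catalogue is rebuilt is what keeps the gene nomenclature consistent while still exposing, via the flag, which experimental resources can no longer be trusted to identify a single gene.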
Resumo:
Background: Wine Saccharomyces cerevisiae strains, adapted to anaerobic must fermentations, suffer oxidative stress when they are grown under aerobic conditions for biomass propagation in the industrial process of active dry yeast production. Oxidative metabolism of sugars favors high biomass yields but also causes increased oxidative damage of cell components. The overexpression of the TRX2 gene, coding for a thioredoxin, enhances oxidative stress resistance in a wine yeast strain model. The thioredoxin system, together with the glutathione/glutaredoxin system, constitutes the most important defense against oxidation. Trx2p is also involved in the regulation of the Yap1p-driven transcriptional response against some reactive oxygen species. Results: Laboratory-scale simulations of the industrial active dry biomass production process demonstrate that TRX2 overexpression increases the wine yeast final biomass yield and also its fermentative capacity, after both the batch and fed-batch phases. Microvinifications carried out with the modified strain show a fast-start phenotype derived from its enhanced fermentative capacity, and also an increased content of beneficial aroma compounds. The modified strain displays an increased transcriptional response of Yap1p-regulated genes and other oxidative stress related genes. Activities of antioxidant enzymes like Sod1p, Sod2p and catalase are also enhanced. Consequently, diminished oxidation of lipids and proteins is observed in the modified strain, which can explain the improved performance of the thioredoxin-overexpressing strain. Conclusions: We report several beneficial effects of overexpressing the thioredoxin gene TRX2 in a wine yeast strain. We show that this strain presents an enhanced redox defense. The increased yield of the biomass production process in the TRX2-overexpressing strain can be of special interest for several industrial applications.
Resumo:
Quality management has become a strategic issue for organisations and is very valuable for producing quality software. However, quality management systems (QMS) are not easy to implement and maintain. The authors' experience shows the benefits of developing a QMS by first formalising it using semantic web ontologies and then putting them into practice through a semantic wiki. The QMS ontology that has been developed captures the core concepts of a traditional QMS and combines them with concepts coming from the MPIu'a development process model, which is geared towards obtaining usable and accessible software products. The ontology semantics is then put directly into play by a semantics-aware tool, the Semantic MediaWiki. The developed QMS tool has been in use for two years by the GRIHO research group, where it has managed almost 50 software development projects while taking quality management issues into account. It has also been externally audited by a quality certification organisation. Its users are very satisfied with their daily work with the tool, which manages all the documents created during project development and also allows them to collaborate, thanks to the wiki features.
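The idea of formalising QMS concepts as an ontology whose assertions a semantic wiki can then query can be illustrated with a minimal triple-store sketch. The class and property names below are invented for illustration; they are not the actual QMS ontology described in the abstract:

```python
# Minimal sketch of QMS concepts expressed as subject-predicate-object
# triples (illustrative names only, not the actual QMS ontology).
triples = [
    # Schema: two QMS classes and a property relating them.
    ("qms:Project", "rdf:type", "owl:Class"),
    ("qms:QualityRecord", "rdf:type", "owl:Class"),
    ("qms:hasQualityRecord", "rdfs:domain", "qms:Project"),
    ("qms:hasQualityRecord", "rdfs:range", "qms:QualityRecord"),
    # Instance data, as a semantic wiki page might assert it:
    ("proj:DemoProject", "rdf:type", "qms:Project"),
    ("proj:DemoProject", "qms:hasQualityRecord", "rec:ExternalAudit"),
]

def objects(subject, predicate):
    """Return all objects matching a (subject, predicate) pattern."""
    return [o for s, p, o in triples if s == subject and p == predicate]

print(objects("proj:DemoProject", "qms:hasQualityRecord"))
# ['rec:ExternalAudit']
```

A semantics-aware wiki works along these lines: each wiki page contributes triples like the instance data above, and the shared schema lets the tool answer queries such as "which quality records belong to this project" across all pages.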
Resumo:
This Master's thesis studied the purification of hot pyrolysis vapour from malodorous and light volatile compounds. The literature part examined the viability of pyrolysis oil as a renewable energy source; in addition, different scrubber types were reviewed and compared. Two different experimental set-ups were used in the experimental part. In product recovery, the effects of reactor temperature and feedstock moisture on pyrolysis yields were compared. In component recovery, the removal of unstable and pungent-smelling compounds from the hot pyrolysis vapour was studied. The feedstock was spruce forest-residue chip, which contains abundant needles and bark. The experiments were carried out in the temperature range 460-520 °C. The experimental set-ups consisted of a hot side and a cold side connected to a gas (N2) feed system. In product recovery, the hot pyrolysis vapour was cooled and collected. In component recovery, the product was collected on a filter and in a methylene chloride trap. Product compositions were analysed by gas chromatography. The highest organic yields were obtained at a reactor temperature of 480 °C and a feedstock moisture of 8-9 wt%. The amount of pyrolysis water decreased as feedstock moisture was increased. Different reactor temperatures and feedstock moistures had no effect on char yields. Gas yields (mainly CO2, CO and hydrocarbons) were about 10 wt%. In component recovery, the filter clogged at low (< 250 °C) temperatures. The material retained on the filter consisted mainly of extractives originating from the needles and bark (mainly resin and fatty acids) and of sugars. At higher temperatures (> 250 °C), the extractives passed through the filter more readily. At 250 and 300 °C, a large amount of short-chain, easily volatilised, unstable and malodorous compounds (ketones, furan and furfural derivatives, etc.) was retained in the methylene chloride and methanol traps.