909 results for Wine and wine making Analysis


Relevance: 100.00%

Abstract:

Dissertation presented at the Faculdade de Ciências e Tecnologia of the Universidade Nova de Lisboa to obtain the degree of Master in Informatics Engineering

Relevance: 100.00%

Abstract:

In Portugal, especially from the 1970s onwards, women's studies had implications for the emergence of the concept of gender and for the feminist critique of the prevailing models of the differences between the sexes. Until then, women had been absent from scientific research both as subjects and as objects. Feminism brought more reflexivity to scientific thinking. After the 25th of April 1974, thanks to the consequent political openness, several innovative research themes emerged, together with new concepts and fields of study. As far as the relationship between gender and science is concerned, however, such studies concentrate mainly on higher education institutions. Feminist thinking seems to have two main objectives: to give women visibility, on the one hand, and to denounce men's dominance of the various fields of knowledge, on the other. In 1977 the "Feminine Commission" was created, and since then it has published studies on women's condition and contributed to deepening the reflection on the female condition at all levels. In the 1980s, the growing feminisation of tertiary education (of both students and academics) favoured the development of women's studies, especially on women's condition within universities, with a special focus on the glass ceiling; the lack of statistical data disaggregated by gender, however, made it difficult to analyse women's integration in several sectors, namely in educational and scientific research activities. Other unifying themes are the family, social and legal condition, work, education, and feminine intervention in political and social movements. In the 1990s, women's studies were institutionalised in the academic context with the creation of the first Master in Women's Studies at the Universidade Aberta (Open University) in Lisbon. In 1999 the first Portuguese journal of women's studies, "Faces de Eva", was created. Seminars, conferences, theses, journals and projects on women's studies are increasingly common. However, results and publications are not disseminated as widely as they should be, owing to the lack of comprehensive and coordinated databases.

2. Analysis by topics

2.1. Horizontal and vertical segregation

Research questions

This is one of the main areas of research in Portugal. Essentially two issues have been considered:
- the analysis of vertical gender segregation in educational and professional fields, which affects women's professional career progression, with special attention to men's hold on positions of power and to the glass ceiling;
- the analysis of horizontal segregation, especially in higher education (teaching and research), where women are less visible than men, and the under-representation of women in technology and technological careers.

Research in this area mainly focuses on description, showing the under-representation of women in certain scientific areas and senior positions. Nevertheless, the studies that analyse horizontal segregation in the field of education adopt a more analytical approach, focusing on the mechanisms that reproduce gender stereotypes, especially socialisation, which influence educational and career choices.

Relevance: 100.00%

Abstract:

Dissertation to obtain the degree of Master in Electrical and Computer Engineering

Relevance: 100.00%

Abstract:

Doctoral thesis in Plant Biology

Relevance: 100.00%

Abstract:

The aim of this work is to evaluate the capabilities and limitations of chemometric methods and other mathematical treatments applied to spectroscopic data, and more specifically to paint samples. The uniqueness of spectroscopic data comes from the fact that they are multivariate (a few thousand variables) and highly correlated. Statistical methods are used to study and discriminate samples. A collection of 34 red paint samples was measured by infrared and Raman spectroscopy. Data pretreatment and variable selection demonstrated that the use of the Standard Normal Variate (SNV), together with removal of the noisy variables by selecting the wavenumber ranges 650-1830 cm−1 and 2730-3600 cm−1, provided the optimal results for the infrared analysis. Principal component analysis (PCA) and hierarchical cluster analysis (HCA) were then used as exploratory techniques to provide evidence of structure in the data, find clusters, or detect outliers. For the FTIR spectra, the principal components (PCs) correspond to binder types and to the presence or absence of calcium carbonate; 83% of the total variance is explained by the first four PCs. As for the Raman spectra, six different clusters corresponding to the different pigment compositions are observed when plotting the first two PCs, which account for 37% and 20% of the total variance, respectively. In conclusion, the use of chemometrics for the forensic analysis of paints provides a valuable tool for objective decision-making, a reduction of possible classification errors, and better efficiency, yielding robust results with time-saving data treatments.
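A minimal sketch of the pretreatment and exploration pipeline described above (SNV on each spectrum, restriction to the two retained wavenumber windows, then PCA); the synthetic spectra and array names are illustrative, not the study's data:

```python
import numpy as np
from sklearn.decomposition import PCA

# Illustrative spectra: rows = samples, columns = wavenumbers (cm^-1).
rng = np.random.default_rng(0)
wavenumbers = np.arange(400, 4000, 2.0)
spectra = rng.random((34, wavenumbers.size))  # placeholder for 34 paint spectra

# Standard Normal Variate: centre and scale each spectrum individually.
snv = (spectra - spectra.mean(axis=1, keepdims=True)) / spectra.std(axis=1, keepdims=True)

# Keep only the two informative windows retained in the study.
keep = ((wavenumbers >= 650) & (wavenumbers <= 1830)) | \
       ((wavenumbers >= 2730) & (wavenumbers <= 3600))
snv_sel = snv[:, keep]

# Exploratory PCA; the leading PCs are then inspected for chemical meaning.
pca = PCA(n_components=4)
scores = pca.fit_transform(snv_sel)
print(pca.explained_variance_ratio_.cumsum())  # variance explained by the first PCs
```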

Relevance: 100.00%

Abstract:

BACKGROUND: Shared decision making (SDM) is increasingly advocated as a model for medical decision making, yet its use in clinical practice remains low. High-impact-factor journals might represent an efficient channel for its dissemination. We aimed to identify and characterize publication trends of SDM in 15 high-impact medical journals. METHODS: We selected the 15 general and internal medicine journals with the highest impact factor publishing original articles, letters and editorials. We retrieved publications from 1996 to 2011 through the full-text search function on each journal website and abstracted bibliometric data. We included publications of any type containing the phrase "shared decision making" or five other variants in their abstract or full text; these are referred to as SDM publications. A polynomial Poisson regression model with a logarithmic link function was used to assess how the number of SDM publications evolved over the period according to publication characteristics. RESULTS: We identified 1285 SDM publications out of 229,179 publications in the 15 journals from 1996 to 2011. The absolute number of SDM publications per journal ranged from 2 to 273 over the 16 years. SDM publications increased both in absolute and in relative numbers per year, from 46 (0.32% of all publications from the 15 journals) in 1996 to 165 (1.17%) in 2011. This growth was exponential (P < 0.01). We found fewer research publications (465, i.e. 36.2% of all SDM publications) than non-research publications, which included non-systematic reviews, letters, and editorials. The increase of research publications across time was linear. Full-text search retrieved ten times more SDM publications than a similar PubMed search (1285 vs. 119). CONCLUSION: This full-text review showed that SDM publications increased exponentially in major medical journals from 1996 to 2011. This growth might reflect an increased dissemination of the SDM concept to the medical community.
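A minimal sketch of the kind of model the abstract describes (a Poisson GLM with log link and a polynomial time trend); the yearly counts below are invented for illustration, with only the 1996 and 2011 endpoints matching the figures quoted above:

```python
import numpy as np
import statsmodels.api as sm

# Illustrative yearly counts of SDM publications (made up, not the study's data).
years = np.arange(1996, 2012)
counts = np.array([46, 50, 55, 61, 68, 74, 80, 88,
                   95, 104, 112, 122, 133, 144, 154, 165])

# Centre the year and build polynomial terms; with a log link, a model that is
# linear in time corresponds to exponential growth on the count scale.
t = years - years.mean()
X = sm.add_constant(np.column_stack([t, t**2]))

model = sm.GLM(counts, X, family=sm.families.Poisson())  # log link is the default
fit = model.fit()
print(fit.summary())

# A significant positive linear term with a negligible quadratic term is
# consistent with exponential growth of the publication counts.
```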

Relevance: 100.00%

Abstract:

The quadrennial need study was developed to assist in identifying county highway financial needs (construction, rehabilitation, maintenance, and administration) and in distributing the road use tax fund (RUTF) among the counties in the state. Since the need study was first conducted using the HWYNEEDS software, between 1982 and 1998, there have been large fluctuations in the level of funds distributed to individual counties. A study performed by Jim Cable (HR-363, 1993) found that one of the major factors affecting the volatility of these fluctuations is the quality and accuracy of the pavement condition data collected. In 1998, Center for Transportation Research and Education researchers (Maze and Smadi) completed a project studying the feasibility of using the automated pavement condition data collected for the Iowa Pavement Management Program (IPMP) on the paved county roads in the HWYNEEDS software (TR-418). The automated condition data are objective and also more current, since they are collected in a two-year cycle compared to the 10-year cycle currently used by HWYNEEDS. The study proved that the use of the automated condition data in HWYNEEDS would be feasible and beneficial in reducing fluctuations when applied to a pilot study area. Among its other recommendations, TR-418 called for a full analysis and investigation of the HWYNEEDS methodology and parameters (for more information, please review the TR-418 project report). The study reported in this document builds on that previous work and covers the analysis and investigation of the HWYNEEDS computer program methodology and parameters. The underlying hypothesis is that, along with the IPMP automated condition data, some changes need to be made to the HWYNEEDS parameters to accommodate the use of the new data, which will stabilize the process of allocating resources and reduce fluctuations from one quadrennial need study to the next. Another objective of this research is to investigate gravel road needs and to study the feasibility of developing a more objective approach to determining needs on the counties' gravel road networks. This study identifies new procedures by which the HWYNEEDS computer program is used to conduct the quadrennial need study on paved roads. Also, a new procedure is developed to determine gravel road needs outside of the HWYNEEDS program. Recommendations are made for the new procedures and for changes to the current quadrennial need study. Future research areas are also identified.

Relevance: 100.00%

Abstract:

Summary: Detection, analysis and monitoring of slope movements by high-resolution digital elevation models.

Slope movements, such as rockfalls, rockslides, shallow landslides or debris flows, are frequent in many mountainous areas. These natural hazards endanger inhabitants and infrastructure, making it necessary to assess the hazard and risk they cause. This PhD thesis explores various approaches using digital elevation models (DEMs), and particularly high-resolution DEMs created by aerial or terrestrial laser scanning (TLS), that contribute to the assessment of slope movement hazard at regional and local scales.

The regional detection of areas prone to rockfalls and large rockslides uses different morphologic criteria or geometric instability factors derived from DEMs, i.e. the steepness of the slope, the presence of discontinuities that enable a sliding mechanism, and the denudation potential. The combination of these factors leads to a map of susceptibility to rockfall initiation that is in good agreement with field studies, as shown for the Little Mill Campground area (Utah, USA). Another case study, in the Illgraben catchment in the Swiss Alps, highlighted the link between areas with a high denudation potential and actual rockfall areas.

Techniques for a detailed analysis and characterization of slope movements based on high-resolution DEMs have been developed for specific, localized sites, i.e. ancient slide scars, presently active instabilities or potential slope instabilities. The analysis of a site's characteristics mainly focuses on rock slopes and includes structural analyses (orientation of discontinuities); estimation of the spacing, persistence and roughness of discontinuities; failure mechanisms based on the structural setting; and volume calculations. For the volume estimation, a new 3D approach was tested to reconstruct the topography before a landslide or to construct the basal failure surface of an active or potential instability. The rockslides at Åknes, Tafjord and Rundefjellet in western Norway were the principal study sites used to develop and test the different techniques.

The monitoring of slope instabilities investigated in this PhD thesis is essentially based on multitemporal (or sequential) high-resolution DEMs, in particular sequential point clouds acquired by TLS. Changes in the topography due to slope movements can be detected and quantified from sequential TLS datasets, notably by shortest-distance comparisons revealing the 3D slope movements over the entire region of interest. A detailed analysis of rock slope movements is based on the affine transformation between an initial and a final state of the rock mass and its decomposition into translational and rotational movements. Monitoring using TLS was very successful on the fast-moving Eiger rockslide in the Swiss Alps, as well as on the active rockslides of Åknes and Nordnesfjellet (northern Norway). One of the main achievements on the Eiger and Åknes rockslides is the combination of each site's morphology and structural setting with the measured slope movements to produce coherent instability models. Both case studies also highlighted a strong control of the structures in the rock mass on the sliding directions. TLS was also used to monitor slope movements in soils, such as landslides in sensitive clays in Québec (Canada), shallow landslides on river banks (Sorge River, Switzerland) and a debris flow channel (Illgraben).

The PhD thesis underlines the broad uses of high-resolution DEMs, and especially of TLS, in the detection, analysis and monitoring of slope movements. Future studies should explore in more depth the different techniques and approaches developed and used in this PhD, improve them, and better integrate the findings into current hazard assessment practices and slope stability models.
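A minimal sketch of the sequential point-cloud comparison described above (the "shortest distance" from each point of a later TLS epoch to the reference epoch, here via a k-d tree); the point clouds are synthetic stand-ins for real scans:

```python
import numpy as np
from scipy.spatial import cKDTree

# Two synthetic TLS epochs of the same slope (N x 3 arrays of x, y, z).
rng = np.random.default_rng(1)
epoch1 = rng.random((10_000, 3))
epoch2 = epoch1 + np.array([0.0, 0.0, -0.02])  # simulated downslope settlement

# Shortest-distance comparison: for every point of the later epoch,
# find the distance to the closest point of the reference epoch.
tree = cKDTree(epoch1)
dist, _ = tree.query(epoch2, k=1)

# Large distances flag areas where the topography changed between scans.
print("mean change:", dist.mean(), "max change:", dist.max())
```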

Relevance: 100.00%

Abstract:

Since 1987, the Iowa Department of Transportation has based control of hot asphalt concrete mixes on cold-feed gradations. This report presents comparisons between cold-feed gradations and gradations of aggregate from the same material after it has been processed through the plant and laydown machine. Results are categorized by mix type, plant type, and method of dust control, in an effort to quantify the changes and identify the factors contributing to them. The results of the report are:
1. From the 390 sample comparisons made, aggregate degradation due to asphalt plant processing was demonstrated by an average increase of +0.7% passing the #200 sieve and an average increase in surface area of +1.8 sq. ft. per pound of aggregate.
2. Categories with Type A mix or recycling as a sorting criterion generally produced greater degradation than categories containing Type B mixes and/or plants with scrubbers.
3. None of the averages calculated for the categories should be considered unacceptably high; however, this information should be considered when making mix changes in the field, selecting asphalt contents for borderline mix designs, or evaluating potential changes to mix gradation specifications or design criteria.
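A minimal sketch of the paired comparison underlying result 1 (cold-feed versus behind-the-paver percent passing the #200 sieve); the five value pairs are invented for illustration, not the report's 390 comparisons:

```python
import numpy as np

# Illustrative paired gradation results (percent passing the #200 sieve);
# the values are made up, not the report's field data.
cold_feed = np.array([5.2, 4.8, 6.1, 5.5, 4.9])
behind_paver = np.array([5.8, 5.6, 6.9, 6.1, 5.7])

# Degradation through the plant and laydown machine shows up as an
# average increase in material passing the #200 sieve.
delta = behind_paver - cold_feed
print(f"average change in % passing #200: {delta.mean():+.1f}")
```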

Relevance: 100.00%

Abstract:

BACKGROUND: Decision curve analysis has been introduced as a method to evaluate prediction models in terms of their clinical consequences when used for a binary classification of subjects into a group who should be treated and a group who should not. The key concept for this type of evaluation is the "net benefit", a concept borrowed from utility theory. METHODS: We recall the foundations of decision curve analysis and discuss some new aspects. First, we stress the formal distinction between the net benefit for the treated and for the untreated, and define the concept of the "overall net benefit". Next, we revisit the important distinction between the concept of accuracy, as typically assessed using the Youden index and receiver operating characteristic (ROC) analysis, and the concept of utility of a prediction model, as assessed using decision curve analysis. Finally, we provide an explicit implementation of decision curve analysis for the context of case-control studies. RESULTS: We show that the overall net benefit, which combines the net benefit for the treated and the untreated, is a natural measure of the benefit achieved by a model, being invariant with respect to the coding of the outcome and conveying a more comprehensive picture of the situation. Further, within the framework of decision curve analysis, we illustrate the important difference between the accuracy and the utility of a model, demonstrating how poor an accurate model may be in terms of its net benefit. Finally, we show that decision curve analysis can be applied to case-control studies, where an accurate estimate of the true prevalence of a disease cannot be obtained from the data, with a few modifications to the original calculation procedure. CONCLUSIONS: We present several interrelated extensions to decision curve analysis that will both facilitate its interpretation and broaden its potential area of application.
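A minimal sketch of the two one-sided net benefit quantities discussed above, following the standard cohort-study formulas (the threshold probability pt weighs false positives against true positives, with a mirror image for the untreated); the data are synthetic, and the paper's overall net benefit and case-control adjustments are not reproduced here:

```python
import numpy as np

def net_benefit_treated(y_true, risk, pt):
    """Standard net benefit of treating subjects whose predicted risk
    exceeds the threshold probability pt."""
    n = len(y_true)
    treat = risk >= pt
    tp = np.sum(treat & (y_true == 1))
    fp = np.sum(treat & (y_true == 0))
    return tp / n - fp / n * pt / (1 - pt)

def net_benefit_untreated(y_true, risk, pt):
    """Mirror-image net benefit of withholding treatment below pt; the
    paper's 'overall net benefit' combines the two sides."""
    n = len(y_true)
    spare = risk < pt
    tn = np.sum(spare & (y_true == 0))
    fn = np.sum(spare & (y_true == 1))
    return tn / n - fn / n * (1 - pt) / pt

# Illustrative decision curve over a grid of thresholds (synthetic data).
rng = np.random.default_rng(2)
y = rng.integers(0, 2, 500)
risk = np.clip(y * 0.3 + rng.random(500) * 0.7, 0, 1)
for pt in (0.1, 0.3, 0.5):
    print(pt, net_benefit_treated(y, risk, pt), net_benefit_untreated(y, risk, pt))
```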

Relevance: 100.00%

Abstract:

Although the genomes of any two human individuals are more than 99.99% identical at the sequence level, some structural variation can be observed. Differences between genomes include single nucleotide polymorphisms (SNPs), inversions and copy number changes (gain or loss of DNA). The latter range from submicroscopic events (CNVs, at least 1 kb in size) to complete chromosomal aneuploidies. Small copy number variations often have no (lethal) consequences for the cell, but a few have been associated with disease susceptibility and phenotypic variation. Larger rearrangements (e.g. the gain of a complete chromosome) are frequently associated with more severe consequences for health, such as genomic disorders and cancer. High-throughput technologies like DNA microarrays enable the detection of CNVs in a genome-wide fashion. Since the initial catalogue of CNVs in the human genome in 2006, there has been tremendous interest in CNVs in the context of both population and medical genetics. Understanding CNV patterns within and between human populations is essential to elucidate their possible contribution to disease. But genome analysis is a challenging task: the technology evolves rapidly, creating the need for novel, efficient and robust analytical tools that must be compared with existing ones. Also, while the link between CNVs and disease has been established, the relative CNV contribution is not fully understood, and the predisposition to disease from CNVs in the general population has not yet been investigated.

During my PhD thesis, I worked on several aspects related to CNVs. As I report in chapter 3, I was interested in computational methods to detect CNVs in the general population. I had access to the CoLaus dataset, a population-based study with more than 6,000 participants from the Lausanne area. All these individuals were analysed on SNP arrays, and extensive clinical information was available. My work explored existing CNV detection methods, and I developed a variety of metrics to compare their performance. Since these methods did not produce entirely satisfactory results, I implemented my own method, which outperformed two existing methods. I also devised strategies to combine CNVs from different individuals into CNV regions.

I was also interested in the clinical impact of CNVs in common disease (chapter 4). Through an international collaboration led by the Centre Hospitalier Universitaire Vaudois (CHUV) and Imperial College London, I was involved as a main data analyst in the investigation of a rare deletion at chromosome 16p11 detected in obese patients. Specifically, we compared 8,456 obese patients and 11,856 individuals from the general population and found that the deletion accounted for 0.7% of the morbid obesity cases and was absent in healthy non-obese controls. This highlights the importance of rare variants with strong impact and provides new insights into the design of clinical studies to identify the missing heritability in common disease.

Furthermore, I was interested in the detection of somatic copy number alterations (SCNAs) and their consequences in cancer (chapter 5). This project was a collaboration initiated by the Ludwig Institute for Cancer Research and involved other groups from the Swiss Institute of Bioinformatics, the CHUV and the Universities of Lausanne and Geneva. The focus of my work was to identify genes with altered expression levels within SCNAs in seven metastatic melanoma cell lines, using CGH and SNP arrays, RNA-seq, and karyotyping. Very few SCNA genes were shared by even two melanoma samples, making it difficult to draw any conclusions at the individual gene level. To overcome this limitation, I used a network-guided analysis to determine whether any pathways, defined by amplified or deleted genes, were common among the samples. Six of the melanoma samples were potentially altered in four pathways, and five samples harboured copy-number and expression changes in components of six pathways. In total, this approach identified 28 pathways. Validation with two external, large melanoma datasets confirmed all but three of the detected pathways and demonstrated the utility of network-guided approaches for the analysis of both large and small datasets.

Lay summary: The advent of molecular biology, particularly over the last ten years, has revolutionised research in medical genetics. With the availability of the human reference genome from 2001 onwards, new technologies such as DNA microarrays emerged and made it possible to study the genome as a whole at a so-called submicroscopic resolution, previously out of reach of traditional cytogenetic techniques. One of the most important examples is the study of structural variation of the genome, in particular of gene copy number. It was established as early as 1959, with the identification of trisomy 21 by Professor Jérôme Lejeune, that the gain of an extra chromosome causes a genetic syndrome with severe consequences for the patient's health. Similar observations have been made in oncology on cancer cells, which frequently accumulate copy number aberrations (such as the loss or gain of one or several chromosomes). From 2004 onwards, several research groups catalogued copy number changes in individuals from the general population (i.e. without visible clinical symptoms), and in 2006 Dr. Richard Redon established the first map of copy number variation in the general population. These discoveries showed that genomic variation is frequent and mostly benign, without clinical consequences for the individual's health, and they raised great interest in understanding natural variation between individuals as well as the genetic predisposition to certain diseases. During my thesis, I developed new computational tools for the analysis of DNA microarrays in order to map these variations genome-wide. I used these tools to characterise the variation in the Swiss population and then studied factors that could explain the predisposition to diseases such as obesity. This study, in collaboration with the Centre Hospitalier Universitaire Vaudois, led to the identification of a deletion on chromosome 16 explaining 0.7% of the cases of morbid obesity. It has several implications: it makes it possible to diagnose this predisposition to obesity in unborn children; the locus involves some twenty genes, which allows new working hypotheses to be formulated and research to be directed towards a better understanding of the disease and, it is hoped, a new treatment; and it provides an alternative to genetic association studies, which have so far met with only mixed success. In the last part of my thesis, I turned to the analysis of copy number aberrations in cancer, choosing to study melanoma, a very aggressive skin tumour that is responsible for 80% of skin cancer deaths and is often resistant to the treatments used in oncology (chemotherapy, radiotherapy). Within a collaboration between the Ludwig Institute for Cancer Research, the Swiss Institute of Bioinformatics, the CHUV and the Universities of Lausanne and Geneva, we sequenced the exome (the genes) and the transcriptome (gene expression) of seven metastatic melanomas and performed copy number analyses with DNA microarrays and karyotypes. My work led to the development of new analysis methods adapted to cancer, to a list of the cell signalling pathways recurrently affected in melanoma, and to the identification of two potential therapeutic targets previously overlooked in skin cancer.
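A minimal sketch of one standard way to combine per-individual CNV calls into population-level CNV regions (sorting and merging overlapping intervals); the coordinates are illustrative, and the thesis' actual strategy may differ:

```python
from typing import List, Tuple

def merge_cnv_regions(calls: List[Tuple[str, int, int]], gap: int = 0):
    """Merge overlapping (or near-adjacent, within `gap` bp) CNV calls from
    many individuals into CNV regions, one common way to build a
    population-level CNV map."""
    regions: List[List] = []
    for chrom, start, end in sorted(calls):
        if regions and regions[-1][0] == chrom and start <= regions[-1][2] + gap:
            regions[-1][2] = max(regions[-1][2], end)  # extend current region
        else:
            regions.append([chrom, start, end])        # open a new region
    return [tuple(r) for r in regions]

# Illustrative calls (chromosome, start, end) from three individuals.
calls = [("16", 29_500_000, 30_100_000),
         ("16", 29_650_000, 30_200_000),
         ("2", 1_000, 5_000)]
print(merge_cnv_regions(calls))
```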

Relevance: 100.00%

Abstract:

Background: Analysing the observed differences in incidence or mortality of a particular disease between two situations (such as time points, geographical areas, genders or other social characteristics) can be useful for both scientific and administrative purposes. From an epidemiological and public health point of view, it is of great interest to assess the effect of demographic factors on these observed differences, in order to elucidate the effect of the risk of developing a disease or dying from it. The method proposed by Bashir and Estève, which splits the observed variation into three components (risk, population structure and population size), is a common choice in practice. Results: A web-based application called RiskDiff has been implemented (available at http://rht.iconcologia.net/riskdiff.htm) to perform this kind of statistical analysis, providing text and graphical summaries. Code for the implemented functions in R is also provided. An application to cancer mortality data from Catalonia is used for illustration. Conclusions: Combining epidemiological and demographic factors is crucial when analysing incidence or mortality from a disease, especially if the population pyramids show substantial differences. The tool implemented may serve to promote and disseminate the use of this method, supporting epidemiological interpretation and decision making in public health.
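A minimal sketch of a three-way decomposition of this kind, implemented by sequential substitution so that the size, structure and risk components sum exactly to the observed difference in case counts; the ordering convention and the toy numbers are illustrative, and Bashir and Estève's exact formulation may differ:

```python
import numpy as np

def decompose(n1, n2, w1, w2, r1, r2):
    """Split the change in expected case counts between two situations into
    population-size, age-structure and risk components (sequential
    substitution; the ordering convention is one of several in use)."""
    size = (n2 - n1) * np.sum(w1 * r1)
    structure = n2 * np.sum((w2 - w1) * r1)
    risk = n2 * np.sum(w2 * (r2 - r1))
    return size, structure, risk

# Illustrative data: 3 age groups, age-structure proportions w, rates r.
n1, n2 = 1_000_000, 1_150_000
w1 = np.array([0.30, 0.50, 0.20])
w2 = np.array([0.25, 0.50, 0.25])
r1 = np.array([1e-5, 5e-5, 4e-4])
r2 = np.array([1e-5, 6e-5, 5e-4])

parts = decompose(n1, n2, w1, w2, r1, r2)
print(parts, "sum:", sum(parts))
# The components telescope to the observed difference in case counts:
print("check:", n2 * np.sum(w2 * r2) - n1 * np.sum(w1 * r1))
```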

Relevance: 100.00%

Abstract:

Occupational exposure modeling is widely used in the context of the E.U. regulation on the registration, evaluation, authorization, and restriction of chemicals (REACH). First-tier tools, such as the European Centre for Ecotoxicology and Toxicology of Chemicals (ECETOC) targeted risk assessment (TRA) or Stoffenmanager, are used to screen a wide range of substances; those of concern are investigated further using second-tier tools, e.g. the Advanced REACH Tool (ART). Local sensitivity analysis (SA) methods are used here to determine the dominant factors for three models commonly used within the REACH framework: ECETOC TRA v3, Stoffenmanager 4.5, and ART 1.5. Based on the results of the SA, the robustness of the models is assessed. For ECETOC TRA, the process category (PROC) is the most important factor, and a failure to identify the correct PROC has severe consequences for the exposure estimate. Stoffenmanager is the most balanced model, and decision-making uncertainties in one modifying factor are less severe in Stoffenmanager. ART requires a careful evaluation of the decisions in the source compartment, since it constitutes ∼75% of the total exposure range, corresponding to an exposure estimate spanning 20-22 orders of magnitude. Our results indicate that there is a trade-off between the accuracy and the precision of the models. Previous studies suggested that ART may lead to more accurate results in well-documented exposure situations. However, the choice of the adequate model should ultimately be determined by the quality of the available exposure data: if the practitioner is uncertain about two or more decisions in the entry parameters, Stoffenmanager may be more robust than ART.
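A minimal sketch of a local, one-at-a-time sensitivity analysis on a generic multiplicative exposure model of the kind these tools implement; the factor names and ranges are invented for illustration and do not correspond to the actual parameters of TRA, Stoffenmanager or ART:

```python
import numpy as np

# Generic multiplicative exposure model: the estimate is the product of
# modifying factors. Names and ranges below are invented for illustration.
factors = {
    "intrinsic_emission": (0.1, 10.0),
    "local_controls":     (0.3, 1.0),
    "dilution":           (0.1, 1.0),
    "duration":           (0.25, 1.0),
}
baseline = {name: np.sqrt(lo * hi) for name, (lo, hi) in factors.items()}

def exposure(vals):
    out = 1.0
    for v in vals.values():
        out *= v
    return out

# One-at-a-time local SA: swing each factor over its range while holding
# the others at baseline; report the spread in orders of magnitude.
for name, (lo, hi) in factors.items():
    low = exposure({**baseline, name: lo})
    high = exposure({**baseline, name: hi})
    print(f"{name:18s} range: {np.log10(high / low):.2f} orders of magnitude")
```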

Relevance: 100.00%

Abstract:

The thesis studies the representations of different elements of contemporary work as present in Knowledge Management (KM). KM is approached as a management discourse that is seen to affect and influence managerial practices in organizations. As representatives of KM discourse, four journal articles are analyzed using the methodology of Critical Discourse Analysis within the framework of Critical Management Studies, with a special emphasis on the question of structure and agency. The results of the analysis reveal that structural elements such as information technology and organizational structures are strongly present in the most influential KM representations, making their improvement a desirable course of action for managers. In contrast, agentic properties do not play a central role; they are subjugated to structural constraints of varying kind and degree. The thesis claims that one such constraint is KM discourse itself, which influences managerial and organizational choices and decision making. The thesis concludes that the way human beings are represented, studied and treated in management studies such as KM needs to be re-examined.

Relevance: 100.00%

Abstract:

In this book, I apply a philosophical approach to the study of the precautionary principle in environmental (and health) risk decision-making. The principle says that unacceptable environmental and health risks should be anticipated and forestalled before the damage comes to fruition, even if scientific understanding of the risks is inadequate. The study consists of introductory chapters, a summary and seven original publications, which aim at explicating the principle, critically analysing the debate on the principle, and constructing a basis for its well-founded use. Papers I-V present the main thesis of this research; in the last two papers, the discussion is widened in new directions. The starting question is how well the currently embraced precautionary principle stands up to critical philosophical scrutiny. The approach employed is analytical: mainly conceptual, argumentative and ethical. The study draws upon Anglo-American-style philosophy on the one hand, and upon sources of law as well as concrete cases and decision-making practices at the European Union level and in its member countries on the other. The framework is environmental (and health) risk governance, including the related law and policy. The main thesis of this study is that the debate on the precautionary principle needs to shift from the question of whether the principle (or its weak or strong interpretation) is well grounded in general, to questions about the theoretical plausibility and the ethical and socio-political justifiability of specific understandings of the principle. The real picture of the precautionary principle is more complex than that found (i.e. presumed) in much of the current academic, political and public debate surrounding it. While certain presumptions and interpretations of the principle are found to be sound, others are theoretically flawed or raise serious practical problems. The analysis discloses conceptual and ethical presumptions and elementary understandings of the precautionary principle, critically assesses current practices invoked in the name of the precautionary principle and public participation, and seeks to build bridges between precaution, engagement and philosophical ethics. It is thus intended to provide a sound basis upon which subsequent academic scrutiny can build.