83 results for Quantitative Methods
at Université de Lausanne, Switzerland
Abstract:
Quantitative approaches in ceramology are gaining ground in excavation reports, archaeological publications and thematic studies. Hence, a wide variety of methods are being used depending on the researchers' theoretical premises, the type of material examined, the context of discovery and the questions that are addressed. The round table that took place in Athens in November 2008 was intended to offer the participants the opportunity to present a selection of case studies on the basis of which methodological approaches were discussed. The aim was to define a set of guidelines for quantification that would prove to be of use to all researchers. Contents: 1) Introduction (Samuel Verdan); 2) Isthmia and beyond. How can quantification help the analysis of EIA sanctuary deposits? (Catherine Morgan); 3) Approaching aspects of cult practice and ethnicity in Early Iron Age Ephesos using quantitative analysis of a Protogeometric deposit from the Artemision (Michael Kerschner); 4) Development of a ceramic cultic assemblage: Analyzing pottery from Late Helladic IIIC through Late Geometric Kalapodi (Ivonne Kaiser, Laura-Concetta Rizzotto, Sara Strack); 5) 'Erfahrungsbericht' of application of different quantitative methods at Kalapodi (Sara Strack); 6) The Early Iron Age sanctuary at Olympia: counting sherds from the Pelopion excavations (1987-1996) (Birgitta Eder); 7) L'aire du pilier des Rhodiens à Delphes: Essai de quantification du mobilier (Jean-Marc Luce); 8) A new approach in ceramic statistical analyses: Pit 13 on Xeropolis at Lefkandi (David A. Mitchell, Irene S. Lemos); 9) Households and workshops at Early Iron Age Oropos: A quantitative approach of the fine, wheel-made pottery (Vicky Vlachou); 10) Counting sherds at Sindos: Pottery consumption and construction of identities in the Iron Age (Stefanos Gimatzidis); 11) Analyse quantitative du mobilier céramique des fouilles de Xombourgo à Ténos et le cas des supports de caisson (Jean-Sébastien Gros); 12) Defining a typology of pottery from Gortyn: The material from a pottery workshop pit (Emanuela Santaniello); 13) Quantification of ceramics from Early Iron Age tombs (Antonis Kotsonas); 14) Quantitative analysis of the pottery from the Early Iron Age necropolis of Tsikalario on Naxos (Xenia Charalambidou); 15) Finding the Early Iron Age in field survey: Two case studies from Boeotia and Magnesia (Vladimir Stissi); 16) Pottery quantification: Some guidelines (Samuel Verdan).
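The quantification guidelines at issue revolve around standard ceramic measures such as fragment count (NR), weight, and estimated vessel equivalents (EVE). The following Python sketch is purely illustrative, with invented records, and is not drawn from any chapter of the volume:

```python
# Purely illustrative sketch of common ceramic quantification measures:
# NR (fragment count), total weight, and rim-based EVE (estimated vessel
# equivalents, the sum of preserved rim fractions). Records are invented
# and not drawn from any chapter of the volume.

records = [
    # (vessel_type, sherd_count, weight_g, rim_percent_preserved)
    ("cup", 42, 310.5, 35.0),
    ("cup", 17, 120.0, 12.5),
    ("amphora", 8, 950.0, 60.0),
]

def quantify(records):
    totals = {}
    for vessel_type, count, weight, rim_pct in records:
        t = totals.setdefault(vessel_type, {"NR": 0, "weight_g": 0.0, "EVE": 0.0})
        t["NR"] += count             # raw fragment count
        t["weight_g"] += weight      # total sherd weight
        t["EVE"] += rim_pct / 100.0  # each full rim circumference counts as 1.0
    return totals

for vt, t in quantify(records).items():
    print(f"{vt}: NR = {t['NR']}, weight = {t['weight_g']:.1f} g, EVE = {t['EVE']:.2f}")
```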
Abstract:
Disasters are often perceived as fast and random events. While the triggers may be sudden, disasters themselves are the result of an accumulation of consequences of inappropriate actions and decisions, and of global change. To modify this perception of risk, advocacy tools are needed. Quantitative methods have been developed to identify the distribution and the underlying factors of risk.

Disaster risk results from the intersection of hazards, exposure and vulnerability. The frequency and intensity of hazards can be influenced by climate change or by the decline of ecosystems; population growth increases exposure, while changes in the level of development affect vulnerability. Since each of these components may change, risk is dynamic and should be reassessed periodically by governments, insurance companies or development agencies. At the global level, such analyses are often performed using databases of reported losses. Our results show that these are likely to be biased, in particular by improvements in access to information; they are not exhaustive and give no information on exposure, intensity or vulnerability. A new approach, independent of reported losses, is therefore necessary.

The research presented here was mandated by the United Nations and by agencies working in development and the environment (UNDP, UNISDR, GTZ, UNEP and IUCN). These organizations needed a quantitative assessment of the underlying factors of risk, both to raise awareness among policymakers and to prioritize disaster risk reduction projects.

The method is based on geographic information systems, remote sensing, databases and statistical analysis. It required a large amount of data (1.7 TB, covering both the physical environment and socio-economic parameters) and several thousand hours of processing. A comprehensive global risk model was developed to reveal the distribution of hazards, exposure and risk, and to identify underlying risk factors for several hazards (floods, tropical cyclones, earthquakes and landslides). Two multiple-risk indexes were generated to compare countries. The results include an evaluation of the role of hazard intensity, exposure, poverty and governance in the pattern and trends of risk. It appears that vulnerability factors change depending on the type of hazard and that, contrary to exposure, their weight decreases as intensity increases.

At the local level, the method was tested to highlight the influence of climate change and ecosystem decline on hazards. In northern Pakistan, deforestation increases landslide susceptibility. Research in Peru (based on satellite imagery and ground data collection) revealed a rapid glacier retreat and provides an assessment of the remaining ice volume as well as scenarios of its possible evolution.

These results were presented to different audiences, including 160 governments. The results and the data generated are available online through an open-source SDI (http://preview.grid.unep.ch). The method is flexible and easily transferable to other scales and issues, with good prospects for adaptation to other research areas. The characterization of risk at the global level and the identification of the role of ecosystems in disaster risk are rapidly developing fields. This research revealed many challenges; some were resolved, while others remain limitations. It is clear, however, that the level of development, and moreover unsustainable development, configures a large part of disaster risk, and that the dynamics of risk are primarily governed by global change.
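The hazard/exposure/vulnerability decomposition described above is often operationalized as a multiplicative index. The toy sketch below assumes the common formulation R = H * E * V; the function name, units and country records are invented, and this is not the authors' actual global model:

```python
import math

# Toy multiplicative risk index following the hazard x exposure x vulnerability
# decomposition described above: R = H * E * V. Names, units and country
# records are invented; this is not the authors' global model.

def risk_index(hazard_freq, exposed_pop, vulnerability):
    """hazard_freq: expected events per year; exposed_pop: people in the hazard
    zone; vulnerability: expected fraction of the exposed population affected
    per event. Returns expected people affected per year."""
    return hazard_freq * exposed_pop * vulnerability

countries = {
    "A": dict(hazard_freq=0.8, exposed_pop=2_000_000, vulnerability=0.002),
    "B": dict(hazard_freq=0.8, exposed_pop=2_000_000, vulnerability=0.0005),
}

for name, params in countries.items():
    r = risk_index(**params)
    print(f"country {name}: expected affected/yr = {r:,.0f} (log10 = {math.log10(r):.2f})")
```

With identical hazard and exposure, the index varies only with vulnerability, which is the kind of development-dependent effect the abstract describes.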
Abstract:
PURPOSE: To review the literature on young people's perspectives on health care with a view to defining domains and indicators of youth-friendly care. METHODS: Three bibliographic databases were searched to identify studies that purportedly measured young people's perspectives on health care. Each study was assessed to identify the constructs, domains, and indicators of adolescent-friendly health care. RESULTS: Twenty-two studies were identified: fifteen used quantitative methods, six used qualitative methods, and one used mixed methods. Eight domains stood out as central to young people's positive experience of care: accessibility of health care; staff attitude; communication; medical competency; guideline-driven care; age-appropriate environments; youth involvement in health care; and health outcomes. Staff attitudes, which included notions of respect and friendliness, appeared universally applicable, whereas other domains, such as an appropriate environment including cleanliness, were more specific to particular contexts. CONCLUSION: These eight domains provide a practical framework for assessing how well services are engaging young people. Measures of youth-friendly health care should address universally applicable indicators of youth-friendly care and may benefit from additional questions that are specific to the local health setting.
Abstract:
Geoheritage promotion through non-personal interpretation and visualisation of geomorphological features. Geoheritage comprises geological features lato sensu to which certain values are attributed, according to their scientific interest, their rarity, their cultural or ecological dimensions, etc. Promoting geoheritage means above all sharing this perspective with non-specialists, explaining what gives these features their value. Geotourism is one of the many ways to achieve this promotion while also contributing to regional development. Making non-specialists understand the origin, specificity and value of landforms requires educational communication, known as interpretation (French: médiation). The educational dimension of this process has several, often neglected, implications, such as taking into account the public's knowledge and expectations, creating a favourable learning environment, and designing attractive content. At the conceptual level, a model of non-personal interpretation (i.e., interpretation through media) was proposed and applied to the empirical development of interpretive products and to their assessment. This model does not guarantee the success of educational communication, but it helps create a favourable environment for the process. Moreover, guidelines for choosing the type of medium and its design were defined on the basis of a compilation of results from cognitive psychology on the use of media for learning. Various qualitative and quantitative methods were applied: questionnaire surveys ex situ and in situ among visitors of mountain geomorphosites, the development of interactive media subsequently tested with different kinds of users (with user tracking and pre- and post-test questionnaires), and group interviews. The results shed light on different aspects of the research questions. The visitor surveys revealed, for example, that geotourism has a genuine audience among visitors of mountain sites: three quarters of them express interest in explanations about geology and landscape evolution. This thesis examined these aspects of the learning process with a focus on visual media, both static and interactive. Most of the visual media currently used in geomorphology were considered. The development of interactive versions of these media as web applications gave a concrete overview of the possibilities offered by new technologies. Users particularly appreciated the richness of the content, the high degree of interactivity and the variety of these applications. Such media encourage visits to the natural site and also seem to meet the interests of varied publics.
Abstract:
In my thesis I present the findings of a multiple-case study on the CSR approach of three multinational companies, applying Basu and Palazzo's (2008) CSR-character as a process model of sensemaking, Suchman's (1995) framework on legitimation strategies, and Habermas's (1996) concept of deliberative democracy. The theoretical framework is based on the assumption of a postnational constellation (Habermas, 2001), which sends multinational companies onto a process of sensemaking (Weick, 1995) with regard to their responsibilities in a globalizing world. The major reason is that mainstream CSR concepts rest on the assumption of a liberal market economy embedded in a nation state, an assumption that no longer fits the changing conditions for the legitimation of corporate behavior in a globalizing world. For the purpose of this study, I primarily looked at two research questions: (i) How can the CSR approach of a multinational corporation be systematized empirically? (ii) What is the impact of the changing conditions in the postnational constellation on the CSR approach of the studied multinational corporations? For the analysis, I adopted a holistic approach (Patton, 1980), combining elements of deductive and inductive theory-building methodology (Eisenhardt, 1989b; Eisenhardt & Graebner, 2007; Glaser & Strauss, 1967; Van de Ven, 1992) with rigorous qualitative data analysis. Primary data were collected through 90 semi-structured interviews, conducted in two rounds with executives and managers in three multinational companies and their respective stakeholders. Raw data originating from interview tapes, field notes, and contact sheets were processed, stored, and managed using the software program QSR NVIVO 7. In the analysis, I applied qualitative methods to strengthen the interpretative part as well as quantitative methods to identify dominating dimensions and patterns. I found three different coping behaviors that provide insights into the corporate mindset. The results suggest that multinational corporations increasingly turn towards relational approaches to CSR to achieve moral legitimacy in formalized dialogical exchanges with their stakeholders, since legitimacy can no longer be derived from a national framework alone. I also looked at the degree to which they have reacted to the postnational constellation by assuming former state duties, and at the underlying reasoning. The findings indicate that CSR approaches become increasingly comprehensive through the integration of political strategies that reflect the growing (self-)perception of multinational companies as political actors. Based on the results, I developed a model that relates the different dimensions of corporate responsibility to the discussion on deliberative democracy, global governance and social innovation, in order to provide guidance for multinational companies in a postnational world. With my thesis, I contribute to management research by (i) delivering a comprehensive critique of the mainstream CSR literature and (ii) filling the gap in thorough qualitative research on CSR in a globalizing world, using the CSR-character as an empirical device; and I contribute to organizational studies by (iii) further advancing the deliberative view of the firm proposed by Scherer and Palazzo (2008).
Abstract:
The aim of this study is to perform a thorough comparison of quantitative susceptibility mapping (QSM) techniques and their dependence on the assumptions made. The compared methodologies were: two iterative single-orientation methodologies minimizing the l2 and l1TV norms with prior knowledge of the edges of the object, one over-determined multiple-orientation method (COSMOS), and a newly proposed modulated closed-form solution (MCF). The performance of these methods was compared using a numerical phantom and in-vivo high-resolution (0.65 mm isotropic) brain data acquired at 7 T using a new coil combination method. For all QSM methods, the relevant regularization and prior-knowledge parameters were systematically changed in order to evaluate the optimal reconstruction in the presence and absence of a ground truth. Additionally, the QSM contrast was compared to conventional gradient recalled echo (GRE) magnitude and R2* maps obtained from the same dataset. The QSM reconstruction results of the single-orientation methods show comparable performance. The MCF method has the highest correlation (corr_MCF = 0.95, r²_MCF = 0.97) with the state-of-the-art method (COSMOS), with the additional advantage of extremely fast computation time. The l-curve method gave the visually most satisfactory balance between reduction of streaking artifacts and over-regularization, with the latter being overemphasized when using the COSMOS susceptibility maps as ground truth. R2* and susceptibility maps, when calculated from the same datasets, although based on distinct features of the data, have a comparable ability to distinguish deep gray matter structures.
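For readers unfamiliar with closed-form QSM, the sketch below shows thresholded k-space division (TKD), a standard single-orientation baseline. It is an illustration of the dipole-inversion problem that all compared methods solve, not the study's MCF method; the threshold value and the toy input are assumptions:

```python
import numpy as np

# Hedged sketch: thresholded k-space division (TKD), a standard closed-form
# single-orientation QSM baseline. This is NOT the study's MCF method; it only
# illustrates the dipole inversion underlying all compared reconstructions.

def dipole_kernel(shape, voxel_size=(0.65, 0.65, 0.65), b0_dir=(0.0, 0.0, 1.0)):
    """Unit dipole kernel D(k) = 1/3 - (k . b0)^2 / |k|^2 in k-space."""
    ks = [np.fft.fftfreq(n, d) for n, d in zip(shape, voxel_size)]
    kx, ky, kz = np.meshgrid(*ks, indexing="ij")
    k2 = kx**2 + ky**2 + kz**2
    k2[k2 == 0] = np.inf  # avoid 0/0 at the k-space origin
    kb = kx * b0_dir[0] + ky * b0_dir[1] + kz * b0_dir[2]
    return 1.0 / 3.0 - kb**2 / k2

def tkd_qsm(field_map, voxel_size, threshold=0.19):
    """Invert the dipole convolution by thresholded division in k-space."""
    D = dipole_kernel(field_map.shape, voxel_size)
    small = np.abs(D) <= threshold
    D_safe = np.where(small, 1.0, D)  # dummy value where D is near zero
    D_inv = np.where(small, np.sign(D) / threshold, 1.0 / D_safe)
    return np.real(np.fft.ifftn(np.fft.fftn(field_map) * D_inv))

# Toy usage: random noise standing in for a measured tissue field map.
chi = tkd_qsm(np.random.randn(32, 32, 32), voxel_size=(0.65, 0.65, 0.65))
print(chi.shape, float(chi.mean()))
```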
Quantitative comparison of reconstruction methods for intra-voxel fiber recovery from diffusion MRI.
Abstract:
Validation is arguably the bottleneck in the diffusion magnetic resonance imaging (MRI) community. This paper evaluates and compares 20 algorithms for recovering the local intra-voxel fiber structure from diffusion MRI data and is based on the results of the "HARDI reconstruction challenge" organized in the context of the "ISBI 2012" conference. The evaluated methods encompass classical techniques well known in the literature, such as diffusion tensor, Q-ball and diffusion spectrum imaging, algorithms inspired by the recent theory of compressed sensing, and brand-new approaches proposed for the first time at this contest. To quantitatively compare the methods under controlled conditions, two datasets with known ground truth were synthetically generated, and two main criteria were used to evaluate the quality of the reconstructions in every voxel: correct assessment of the number of fiber populations and angular accuracy in their orientation. This comparative study investigates the behavior of every algorithm under varying experimental conditions and highlights the strengths and weaknesses of each approach. This information can be useful not only for enhancing current algorithms and developing the next generation of reconstruction methods, but also for assisting physicians in the choice of the most adequate technique for their studies.
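The angular-accuracy criterion reduces to per-voxel geometry: the angle between each estimated fiber direction and the closest ground-truth direction, ignoring sign. A hedged sketch with invented vectors:

```python
import numpy as np

# Sketch of the angular-accuracy criterion: for each estimated fiber direction,
# the angle to the closest ground-truth direction. Fiber directions have no
# polarity, hence the abs() on the dot product. Vectors below are invented.

def angular_error_deg(estimated, ground_truth):
    """Mean angle (degrees) from each estimated direction to the nearest true one."""
    errors = []
    for e in estimated:
        e = e / np.linalg.norm(e)
        best = min(
            np.degrees(np.arccos(np.clip(abs(np.dot(e, g / np.linalg.norm(g))), 0.0, 1.0)))
            for g in ground_truth
        )
        errors.append(best)
    return float(np.mean(errors))

# Invented example: two estimated peaks vs. a 90-degree crossing ground truth.
est = [np.array([1.0, 0.05, 0.0]), np.array([0.0, 1.0, 0.1])]
gt = [np.array([1.0, 0.0, 0.0]), np.array([0.0, 1.0, 0.0])]
print(f"mean angular error: {angular_error_deg(est, gt):.2f} degrees")
```

The other criterion, correct assessment of the number of fiber populations, is then simply a comparison of len(est) against len(gt) per voxel.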
Abstract:
Interpretability and power of genome-wide association studies can be increased by imputing unobserved genotypes, using a reference panel of individuals genotyped at higher marker density. For many markers, genotypes cannot be imputed with complete certainty, and the uncertainty needs to be taken into account when testing for association with a given phenotype. In this paper, we compare currently available methods for testing association between uncertain genotypes and quantitative traits. We show that some previously described methods offer poor control of the false-positive rate (FPR), and that satisfactory performance of these methods is obtained only by using ad hoc filtering rules or by using a harsh transformation of the trait under study. We propose new methods that are based on exact maximum likelihood estimation and use a mixture model to accommodate nonnormal trait distributions when necessary. The new methods adequately control the FPR and also have equal or better power compared to all previously described methods. We provide a fast software implementation of all the methods studied here; our new method requires computation time of less than one computer-day for a typical genome-wide scan, with 2.5 M single nucleotide polymorphisms and 5000 individuals.
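A common approach in this setting is a dosage-based test, regressing the trait on the expected allele count computed from imputation posteriors. The minimal sketch below, on simulated data, illustrates that general idea; it is not the authors' exact maximum-likelihood mixture-model method:

```python
import numpy as np
from scipy import stats

# Minimal sketch of a standard dosage-based association test: regress the
# quantitative trait on the expected allele count E[g] = P(het) + 2 P(alt/alt)
# computed from imputation posterior probabilities. This illustrates the
# common baseline, not the paper's maximum-likelihood mixture-model method.

rng = np.random.default_rng(0)
n = 5000
# Simulated imputation posteriors for (ref/ref, het, alt/alt) at one marker.
probs = rng.dirichlet([8.0, 3.0, 1.0], size=n)
dosage = probs[:, 1] + 2.0 * probs[:, 2]    # expected allele count in [0, 2]
trait = 0.05 * dosage + rng.normal(size=n)  # simulated phenotype

slope, intercept, r, p, se = stats.linregress(dosage, trait)
print(f"beta = {slope:.3f} (SE {se:.3f}), p = {p:.2e}")
```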
Abstract:
Five selective serotonin reuptake inhibitors (SSRIs) have recently been introduced: citalopram, fluoxetine, fluvoxamine, paroxetine and sertraline. Although no therapeutic window has been defined for SSRIs, in contrast to tricyclic antidepressants, analytical methods for therapeutic drug monitoring of SSRIs are useful in several instances. SSRIs differ widely in their chemical structure and in their metabolism. The fact that some of them have N-demethylated metabolites which are themselves SSRIs requires methods that allow therapeutic drug monitoring of both the parent compounds and these active metabolites. Most procedures are based on prepurification of the SSRIs by liquid-liquid extraction before they are submitted to separation by chromatographic procedures (high-performance liquid chromatography, gas chromatography, thin-layer chromatography) and detection by various detectors (UV, fluorescence, electrochemical detector, nitrogen-phosphorus detector, mass spectrometry). This literature review shows that most methods allow quantitative determination of SSRIs in plasma in the lower ng/ml range and that they are therefore suitable for therapeutic drug monitoring of this category of drugs.
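Quantitative determination in the lower ng/ml range ultimately rests on a calibration curve relating detector response to concentration. The sketch below shows that generic step with invented numbers; it is not tied to any specific procedure reviewed here:

```python
import numpy as np
from scipy import stats

# Generic calibration-curve quantification, the arithmetic behind quantitative
# chromatographic determination: fit the peak-area ratio (analyte vs. internal
# standard) against spiked concentration, then invert for an unknown sample.
# All numbers are invented and not tied to any reviewed procedure.

conc_ng_ml = np.array([5.0, 10.0, 25.0, 50.0, 100.0, 250.0])  # standards
area_ratio = np.array([0.11, 0.21, 0.54, 1.05, 2.08, 5.20])   # detector response

slope, intercept, r, p, se = stats.linregress(conc_ng_ml, area_ratio)
print(f"calibration: ratio = {slope:.4f} * conc + {intercept:.4f} (r^2 = {r**2:.4f})")

unknown_ratio = 0.83  # hypothetical patient sample
conc = (unknown_ratio - intercept) / slope
print(f"estimated plasma concentration: {conc:.1f} ng/ml")
```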
Abstract:
Molecular monitoring of BCR/ABL transcripts by real-time quantitative reverse-transcription PCR (qRT-PCR) is an essential technique for the clinical management of patients with BCR/ABL-positive CML and ALL. Though quantitative BCR/ABL assays are performed in hundreds of laboratories worldwide, results among these laboratories cannot be reliably compared due to heterogeneity in test methods, data analysis, reporting, and the lack of quantitative standards. Recent efforts towards standardization have been limited in scope. Aliquots of RNA were sent to clinical test centers worldwide in order to evaluate methods and reporting for e1a2, b2a2, and b3a2 transcript levels using their own qRT-PCR assays. Total RNA was isolated from tissue culture cells that expressed each of the different BCR/ABL transcripts. Serial log dilutions were prepared, ranging from 10^0 to 10^-5, in RNA isolated from HL60 cells. Laboratories performed 5 independent qRT-PCR reactions for each sample type at each dilution. In addition, 15 qRT-PCR reactions of the 10^-3 b3a2 RNA dilution were run to assess reproducibility within and between laboratories. Participants were asked to run the samples following their standard protocols and to report cycle threshold (Ct), quantitative values for BCR/ABL and housekeeping genes, and ratios of BCR/ABL to housekeeping genes for each sample RNA. Thirty-seven (n=37) participants submitted qRT-PCR results for analysis (36, 37, and 34 labs generated data for b2a2, b3a2, and e1a2, respectively). The limit of detection for this study was defined as the lowest dilution at which a Ct value could be detected for all 5 replicates. For b2a2, 15, 16, 4, and 1 lab(s) showed a limit of detection at the 10^-5, 10^-4, 10^-3, and 10^-2 dilutions, respectively. For b3a2, 20, 13, and 4 labs showed a limit of detection at the 10^-5, 10^-4, and 10^-3 dilutions, respectively. For e1a2, 10, 21, 2, and 1 lab(s) showed a limit of detection at the 10^-5, 10^-4, 10^-3, and 10^-2 dilutions, respectively. Log %BCR/ABL ratio values provided a method for comparing results between the different laboratories for each BCR/ABL dilution series. Linear regression analysis revealed concordance among the majority of participant data over the 10^-1 to 10^-4 dilutions. The overall slope values showed comparable results among the majority of b2a2 (mean=0.939; median=0.9627; range 0.399-1.1872), b3a2 (mean=0.925; median=0.922; range 0.625-1.140), and e1a2 (mean=0.897; median=0.909; range 0.5174-1.138) laboratory results (Fig. 1-3). Thirty-four (n=34) of the 37 laboratories reported Ct values for all 15 replicates, and only those with a complete data set were included in the inter-lab calculations. Eleven laboratories either did not report their copy-number data or used other reporting units such as nanograms or cell numbers; therefore, only 26 laboratories were included in the overall analysis of copy numbers. The median copy number was 348.4, with a range from 15.6 to 547,000 copies (approximately a 4.5 log difference); the median intra-lab %CV was 19.2%, with a range from 4.2% to 82.6%. While our international performance evaluation using serially diluted RNA samples has reinforced the fact that heterogeneity exists among clinical laboratories, it has also demonstrated that performance within a laboratory is overall very consistent. Accordingly, the availability of defined BCR/ABL RNAs may facilitate the validation of all phases of quantitative BCR/ABL analysis and may be extremely useful as a tool for monitoring assay performance.
Ongoing analyses of these materials, along with the development of additional control materials, may solidify consensus around their application in routine laboratory testing and possible integration in worldwide efforts to standardize quantitative BCR/ABL testing.
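The cross-laboratory comparison rests on simple statistics over the dilution series: the regression slope of the log ratio against the log dilution (ideally 1.0) and the intra-laboratory %CV of replicate copy numbers. A sketch of both computations, with invented reported values:

```python
import statistics
from scipy import stats

# Sketch of the dilution-series statistics used to compare laboratories:
# the regression slope of log10(%BCR/ABL ratio) against log10(dilution),
# ideally 1.0, and the intra-lab %CV of replicate copy numbers. All values
# below are invented, not taken from any participating laboratory.

log_dilution = [-1.0, -2.0, -3.0, -4.0]  # the 10^-1 .. 10^-4 series
log_ratio = [1.02, 0.05, -0.98, -2.10]   # one lab's log10 %BCR/ABL ratios

slope, intercept, r, p, se = stats.linregress(log_dilution, log_ratio)
print(f"slope = {slope:.3f} (ideal: 1.0), r^2 = {r**2:.3f}")

replicates = [310, 355, 348, 290, 402, 371, 333]  # copies, 10^-3 b3a2 replicates
cv = statistics.stdev(replicates) / statistics.mean(replicates) * 100.0
print(f"intra-lab %CV = {cv:.1f}%")
```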
Abstract:
The World Health Organization (WHO) criteria for the diagnosis of osteoporosis are mainly applicable to dual X-ray absorptiometry (DXA) measurements at the spine and hip. There is a growing demand for cheaper devices free of ionizing radiation, such as the promising quantitative ultrasound (QUS) devices. In common with many other countries, QUS measurements are increasingly used in Switzerland without adequate clinical guidelines. The T-score approach developed for DXA cannot be applied to QUS, although well-conducted prospective studies have shown that ultrasound can be a valuable predictor of fracture risk. As a consequence, an expert committee named the Swiss Quality Assurance Project (SQAP), whose main mission is the establishment of quality assurance procedures for DXA and QUS in Switzerland, was mandated by the Swiss Association Against Osteoporosis (ASCO) in 2000 to propose operational clinical recommendations for the use of QUS in the management of osteoporosis for two QUS devices sold in Switzerland. Device-specific weighted "T-scores", based on the risk of osteoporotic hip fractures as well as on the prediction of DXA osteoporosis at the hip according to the WHO definition of osteoporosis, were calculated for the Achilles (Lunar, General Electric, Madison, Wis.) and Sahara (Hologic, Waltham, Mass.) ultrasound devices. Several studies (totaling a few thousand subjects) were used to calculate age-adjusted odds ratios (OR) and the area under the receiver operating characteristic curve (AUC) for the prediction of osteoporotic fracture (taking into account a weighting score depending on the design of each study involved in the calculation). The ORs were 2.4 (1.9-3.2) and the AUC 0.72 (0.66-0.77) for the Achilles, and 2.3 (1.7-3.1) and 0.75 (0.68-0.82) for the Sahara device. To translate risk estimates into thresholds for clinical application, 90% sensitivity was used to define low fracture and low osteoporosis risk, and a specificity of 80% was used to define subjects at high risk of fracture or of having osteoporosis at the hip. From the combination of the fracture model with the hip DXA osteoporosis model, we found T-score thresholds of -1.2 and -2.5 for the stiffness index (Achilles), identifying low- and high-risk subjects, respectively. Similarly, we found T-score thresholds of -1.0 and -2.2 for the QUI index (Sahara). A screening strategy combining QUS, DXA, and clinical factors for the identification of women needing treatment was then proposed. The application of this approach will help to minimize the inappropriate use of QUS from which the whole field currently suffers.
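The resulting screening strategy amounts to a two-threshold triage on the device-specific T-score. The sketch below uses the cutoffs quoted in this abstract; the three-way outcome (low risk, refer to DXA, high risk) is an assumed reading of the strategy, not a verbatim clinical protocol:

```python
# Sketch of a two-threshold triage on device-specific QUS T-scores: scores
# above the low-risk cutoff need no further testing, scores below the
# high-risk cutoff indicate high risk, and the intermediate zone is referred
# to DXA. Cutoffs are the values quoted in the abstract; the three-way
# decision rule itself is an assumption for illustration.

THRESHOLDS = {
    "Achilles_stiffness": (-1.2, -2.5),  # (low-risk cutoff, high-risk cutoff)
    "Sahara_QUI": (-1.0, -2.2),
}

def triage(device: str, t_score: float) -> str:
    low_cut, high_cut = THRESHOLDS[device]
    if t_score >= low_cut:
        return "low risk: no further testing"
    if t_score <= high_cut:
        return "high risk: manage as osteoporosis / confirm"
    return "intermediate: refer to DXA"

for t in (-0.5, -1.8, -2.7):
    print(f"Achilles stiffness T = {t}: {triage('Achilles_stiffness', t)}")
```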
Abstract:
The timing and organization of sleep architecture are mainly controlled by the circadian system, while sleep need and intensity are regulated by a homeostatic process. How independent these two systems are in regulating sleep is not well understood. In contrast to the impressive progress in the molecular genetics of circadian rhythms, little is known about the molecular basis of sleep. Nevertheless, as summarized here, phenotypic dissection of sleep into its most basic aspects can be used to identify both single major genes and the small-effect quantitative trait loci involved. Although experimental models such as the mouse are more readily amenable to genetic analysis of sleep, similar approaches can be applied to humans.
Abstract:
Quantitative ultrasound (QUS) appears to be developing into an acceptable, low-cost and readily accessible alternative to dual X-ray absorptiometry (DXA) measurement of bone mineral density (BMD) in the detection and management of osteoporosis. Perhaps the major difficulty with its widespread use is that many different QUS devices exist that differ substantially from each other in the parameters they measure and in the strength of the empirical evidence supporting their use. Another problem is that virtually no data exist outside Caucasian or Asian populations. In general, heel QUS appears to be the most tested and most effective. Some, but not all, heel QUS devices are effective in assessing fracture risk in some, but not all, populations; the evidence is strongest for Caucasian females > 55 years old, though some evidence exists for Asian females > 55 and for Caucasian and Asian males > 70. Certain devices may allow estimation of the likelihood of osteoporosis, but very limited evidence supports the use of QUS in initiating or monitoring osteoporosis treatment. QUS is likely most effective when combined with an assessment of clinical risk factors (CRF), with DXA reserved for individuals who are identified as neither high nor low risk using QUS and CRF. However, monitoring and maintenance of test and instrument accuracy, precision and reproducibility are essential if QUS devices are to be used in clinical practice, and further scientific research in non-Caucasian, non-Asian populations is clearly required to validate this tool for more widespread use.
Abstract:
BACKGROUND: The value of adenovirus plasma DNA detection as an indicator of adenovirus disease is unknown in the context of T cell-replete hematopoietic cell transplantation, of which adenovirus disease is an uncommon but serious complication. METHODS: Three groups, totaling 62 T cell-replete hematopoietic cell transplant recipients, were selected and tested for adenovirus in plasma by polymerase chain reaction. RESULTS: Adenovirus was detected in 21 (87.5%) of 24 patients with proven adenovirus disease (group 1), in 4 (21%) of 19 patients who shed adenovirus (group 2), and in 1 (10.5%) of 19 uninfected control patients (group 3). The maximum viral load was significantly higher in group 1 (median maximum viral load, 6.3×10^6 copies/mL; range, 0 to 1.0×10^9 copies/mL) than in group 2 (median maximum viral load, 0 copies/mL; range, 0 to 1.7×10^8 copies/mL; P<.001) and in group 3 (median maximum viral load, 0 copies/mL; range, 0-40 copies/mL; P<.001). All patients in group 2 who developed adenoviremia had symptoms compatible with adenovirus disease (i.e., possible disease). A minimal plasma viral load of 10^3 copies/mL was detected in all patients with proven or possible disease. Adenoviremia was detectable at a median of 19.5 days (range, 8-48 days) and 24 days (range, 9-41 days) before death for patients with proven and possible adenovirus disease, respectively. CONCLUSION: Sustained or high-level adenoviremia appears to be a specific and sensitive indicator of adenovirus disease after T cell-replete hematopoietic cell transplantation. In the context of a low prevalence of adenovirus disease, the use of polymerase chain reaction on plasma specimens to detect virus might be a valuable tool to identify and treat patients at risk for invasive viral disease.