55 results for reliability and validity
Abstract:
AIMS AND OBJECTIVES: To evaluate the reliability and the factor structure of the Readiness for Hospital Discharge Scale - French version. BACKGROUND: The patient's perspective is essential when assessing risk for adverse events at hospital discharge. Developed in the USA, the Readiness for Hospital Discharge Scale is the only instrument that measures an individual's self-perception of readiness before leaving the hospital. A French version of the Readiness for Hospital Discharge Scale was developed and validated. DESIGN: Cross-sectional study. METHODS: A convenience sample of 265 older inpatients from four medical units was selected. The translation and cultural adaptation of the scale involved experts in gerontology and the French language and included back translation. The items were semantically evaluated and pretested in 10 older inpatients. The scale's psychometric properties were internally validated by using confirmatory and exploratory factor analyses. Reliability was assessed by examining the internal consistency of its items. RESULTS: Goodness-of-fit indices of the confirmatory factor analyses were not adequate, but reliability was acceptable (Cronbach's α = 0·80). Exploratory factor analysis of the French version provided results close to those described for the English version, with three similar subscales (physical and emotional readiness, coping with medical treatment and personal care), whereas the initially described Expected Support subscale was not identified in the French version. CONCLUSION: The Readiness for Hospital Discharge Scale - French version appears to be partially consistent with its original English version, but requires additional adaptation to fully take into account the Swiss context and culture to achieve its original aim. RELEVANCE TO CLINICAL PRACTICE: Assessing patient readiness for hospital discharge before leaving hospital could help nurses to improve the discharge planning process and achieve better patient preparedness and care coordination.
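Since this abstract (like several below) reports reliability as Cronbach's α, a minimal sketch of how that coefficient is computed from an item-response matrix may help; the data here are hypothetical and stand in for actual scale responses:

```python
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """Cronbach's alpha for an (n_respondents, n_items) score matrix:
    alpha = k/(k-1) * (1 - sum(item variances) / variance(total score))."""
    k = items.shape[1]
    item_vars = items.var(axis=0, ddof=1)      # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)  # variance of summed scores
    return k / (k - 1) * (1 - item_vars.sum() / total_var)

# Hypothetical responses: 6 respondents x 4 scale items (0-10 ratings).
scores = np.array([
    [8, 7, 9, 8],
    [5, 6, 5, 4],
    [9, 9, 8, 9],
    [3, 4, 2, 3],
    [7, 6, 7, 8],
    [6, 5, 6, 6],
])
print(f"alpha = {cronbach_alpha(scores):.2f}")
```

Values of α around 0.8, as reported above, are conventionally read as acceptable internal consistency.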
Abstract:
OBJECTIVE: Since 2011, the new national final examination in human medicine has been administered in Switzerland, with a structured clinical-practical part in the OSCE format. From the perspective of the national Working Group, this article describes the essential steps in the development, implementation and evaluation of the Federal Licensing Examination Clinical Skills (FLE CS), as well as the quality assurance measures applied. Finally, central insights gained over the past years are presented. METHODS: Following the principles of action research, the FLE CS is under constant further development. Building on systematically documented experiences from previous years, the Working Group discusses unresolved questions and elaborates solution approaches (planning), which are then implemented in the examination (implementation) and subsequently evaluated (reflection). The results presented here are the product of this iterative procedure. RESULTS: The FLE CS is created by experts from all faculties and subject areas in a multistage process. The examination is administered in German and French on a decentralised basis and consists of twelve interdisciplinary stations per candidate. The national Review Board (content validation) and the meetings of the standardised patient trainers (standardisation) have proven worthwhile as quality assurance measures. Statistical analyses show good measurement reliability and support the construct validity of the examination. A central insight of recent years is that consistent implementation of the action research principles contributes to the successful further development of the examination. CONCLUSION: The centrally coordinated, collaborative-iterative process, incorporating experts from all faculties, makes a fundamental contribution to the quality of the FLE CS. The processes and insights presented here can be useful to others planning a similar undertaking.
Abstract:
Crizotinib is a first-in-class oral anaplastic lymphoma kinase (ALK) inhibitor targeting ALK-rearranged non-small-cell lung cancer. The therapy was approved by the US FDA in August 2011 and received conditional marketing approval by the European Commission in October 2012 for advanced non-small-cell lung cancer. A break-apart FISH-based assay was jointly approved with crizotinib by the FDA. This assay and an immunohistochemistry assay that uses a D5F3 rabbit monoclonal primary antibody were also approved for marketing in Europe in October 2012. While ALK rearrangement has relatively low prevalence, a clinical benefit is exhibited in more than 85% of patients with median progression-free survival of 8-10 months. In this article, the authors summarize the therapy and alternative test strategies for identifying patients who are likely to respond to therapy, including key issues for effective and efficient testing. The key economic considerations regarding the joint companion diagnostic and therapy are also presented. Given the observed clinical benefit and relatively high cost of crizotinib therapy, companion diagnostics should be evaluated relative to response to therapy versus correlation alone whenever possible, and both high inter-rater reliability and external quality assessment programs are warranted.
Abstract:
This thesis proposes a set of adaptive broadcast solutions and an adaptive data replication solution to support the deployment of P2P applications. P2P applications are an emerging type of distributed application that runs on top of P2P networks; typical examples are video streaming and file sharing. While interesting because they are fully distributed, P2P applications suffer from several deployment problems due to the nature of the environment in which they run. Indeed, defining an application on top of a P2P network often means defining an application where peers contribute resources in exchange for their ability to use the application. For example, in a P2P file-sharing application, while the user is downloading a file, the application is in parallel serving that file to other users. Such peers may have limited hardware resources (e.g., CPU, bandwidth and memory), or the end user may decide a priori to limit the resources dedicated to the P2P application. In addition, a P2P network is typically immersed in an unreliable environment, where communication links are subject to message losses and processes are subject to crashes. To support P2P applications, this thesis proposes a set of services that address some of the underlying constraints of P2P networks. The proposed services include a set of adaptive broadcast solutions and an adaptive data replication solution that can serve as the basis of several P2P applications. Our data replication solution increases availability and reduces communication overhead. The broadcast solutions aim at providing a communication substrate encapsulating one of the key communication paradigms used by P2P applications: broadcast. They typically aim at offering reliability and scalability to some upper layer, be it an end-to-end P2P application or another system-level layer, such as a data replication layer. Our contributions are organized in a protocol stack made of three layers. In each layer, we propose a set of adaptive protocols that address specific constraints imposed by the environment, and each protocol is evaluated through a set of simulations. The adaptiveness of our solutions lies in the fact that they take the constraints of the underlying system into account in a proactive manner. To model these constraints, we define an environment approximation algorithm that yields an approximate view of the system or part of it; this view includes the topology and the reliability of the components, expressed in probabilistic terms. To adapt to the underlying system constraints, the proposed broadcast solutions route messages through tree overlays so as to maximize broadcast reliability. Here, broadcast reliability is expressed as a function of the reliability of the selected paths and of the use of available resources, the latter modeled as message quotas reflecting the receiving and sending capacities at each node. To allow deployment in a large-scale system, we account for the memory available at each process by limiting the view it must maintain of the system. Using this partial view, we propose three scalable broadcast algorithms, based on a propagation overlay that converges toward the global tree overlay and adapts to constraints of the underlying system.
At a higher level, this thesis also proposes a data replication solution that is adaptive both in terms of replica placement and in terms of request routing. At the routing level, this solution takes the unreliability of the environment into account in order to maximize the reliable delivery of requests. At the replica placement level, the dynamically changing origin and frequency of read/write requests are analyzed in order to define a set of replicas that minimizes communication cost.
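The abstract above describes routing through tree overlays so as to maximize path reliability. Although the thesis's own protocols are not reproduced here, the underlying optimization has a standard algorithmic reading: if each link succeeds independently with probability p, a path's reliability is the product of its link probabilities, and the most reliable path is a shortest path under weights -log(p). A sketch under those assumptions, on a hypothetical overlay:

```python
import heapq
import math

def most_reliable_path(links, source, target):
    """Dijkstra on -log(p) weights: minimizing the summed -log(p)
    maximizes the product of per-link success probabilities."""
    graph = {}
    for u, v, p in links:
        graph.setdefault(u, []).append((v, -math.log(p)))
        graph.setdefault(v, []).append((u, -math.log(p)))
    dist, prev = {source: 0.0}, {}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == target:
            break
        if d > dist.get(u, math.inf):
            continue  # stale queue entry
        for v, w in graph.get(u, []):
            if d + w < dist.get(v, math.inf):
                dist[v], prev[v] = d + w, u
                heapq.heappush(heap, (d + w, v))
    path = [target]
    while path[-1] != source:
        path.append(prev[path[-1]])
    return path[::-1], math.exp(-dist[target])

# Hypothetical overlay: (node, node, link success probability).
links = [("a", "b", 0.90), ("b", "d", 0.90),
         ("a", "c", 0.99), ("c", "d", 0.95)]
print(most_reliable_path(links, "a", "d"))
# -> (['a', 'c', 'd'], ~0.94): this route beats a-b-d (0.81) on reliability
```

The thesis additionally constrains such routes by per-node message quotas and partial views; those refinements are beyond this sketch.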
Abstract:
INTRODUCTION: Optimal identification of subtle cognitive impairment in the primary care setting requires a very brief tool combining (a) patients' subjective impairments, (b) cognitive testing, and (c) information from informants. The present study developed a new, very quick and easily administered case-finding tool combining these assessments ('BrainCheck') and tested the feasibility and validity of this instrument in two independent studies. METHODS: We developed a case-finding tool comprising patient-directed (a) questions about memory and depression and (b) clock drawing, and (c) the informant-directed 7-item version of the Informant Questionnaire on Cognitive Decline in the Elderly (IQCODE). Feasibility study: 52 general practitioners rated the feasibility and acceptance of the patient-directed tool. Validation study: an independent group of 288 Memory Clinic patients (mean ± SD age = 76.6 ± 7.9 years, education = 12.0 ± 2.6 years; 53.8% female) with diagnoses of mild cognitive impairment (n = 80), probable Alzheimer's disease (n = 185), or major depression (n = 23) and 126 demographically matched, cognitively healthy volunteers (age = 75.2 ± 8.8 years, education = 12.5 ± 2.7 years; 40% female) took part. All patients and healthy controls were administered the patient-directed tool, and informants of 113 patients and 70 healthy controls completed the very short IQCODE. RESULTS: Feasibility study: general practitioners rated the patient-directed tool as highly feasible and acceptable. Validation study: a Classification and Regression Tree analysis generated an algorithm to categorize the patient-directed data, resulting in a correct classification rate (CCR) of 81.2% (sensitivity = 83.0%, specificity = 79.4%). Critically, the CCR of the combined patient- and informant-directed instruments (BrainCheck) reached 89.4% (sensitivity = 97.4%, specificity = 81.6%). CONCLUSION: A new and very brief instrument for general practitioners, 'BrainCheck', combined three sources of information deemed critical for effective case-finding (patients' subjective impairments, cognitive testing, and informant information) and achieved a CCR of nearly 90%. It thus provides a very efficient and valid tool to help general practitioners decide whether patients with suspected cognitive impairment should be further evaluated or not ('watchful waiting').
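For reference, the rates quoted above follow directly from 2x2 confusion-matrix counts; a minimal sketch with illustrative counts (the abstract reports rates, not raw counts):

```python
def classification_summary(tp: int, fn: int, tn: int, fp: int):
    """Standard 2x2 confusion-matrix summaries."""
    sensitivity = tp / (tp + fn)           # detected patients / all patients
    specificity = tn / (tn + fp)           # cleared healthy / all healthy
    ccr = (tp + tn) / (tp + fn + tn + fp)  # correct classification rate
    return sensitivity, specificity, ccr

# Illustrative counts only, roughly in the range of the validation study.
sens, spec, ccr = classification_summary(tp=110, fn=3, tn=57, fp=13)
print(f"sensitivity={sens:.1%}  specificity={spec:.1%}  CCR={ccr:.1%}")
```

Note that the CCR depends on the patient/control mix in the sample as well as on sensitivity and specificity.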
Abstract:
Communication is an indispensable component of animal societies, yet many open questions remain regarding the factors affecting the evolution and reliability of signalling systems. A potentially important factor is the level of genetic relatedness between signallers and receivers. To quantitatively explore the role of relatedness in the evolution of reliable signals, we conducted artificial evolution over 500 generations in a system of foraging robots that can emit and perceive light signals. By devising a quantitative measure of signal reliability, and comparing independently evolving populations differing in within-group relatedness, we show a strong positive correlation between relatedness and reliability. Unrelated robots produced unreliable signals, whereas highly related robots produced signals that reliably indicated the location of the food source and thereby increased performance. Comparisons across populations also revealed that the frequency of signal production (which is often used as a proxy of signal reliability in empirical studies of animal communication) is a poor predictor of signal reliability and, accordingly, is not consistently correlated with group performance. This has important implications for our understanding of signal evolution and the empirical tools used to investigate communication.
The international development of the RGHQoL: a quality of life measure for recurrent genital herpes.
Abstract:
This paper describes the international development and psychometric testing of the Recurrent Genital Herpes Quality of Life Questionnaire (RGHQoL), a condition-specific quality of life (QoL) instrument. The theoretical foundation for the measure is the needs-based model of QoL, and the content of the instrument was derived from in-depth qualitative interviews with relevant patients in the UK. Versions of the RGHQoL were required for the UK, USA, Italy, Germany, France and Denmark for use in international clinical trials. The results indicate that the final 20-item measure has good reliability, internal consistency and validity for all language versions. A small responsiveness study in Denmark suggested that the measure is sensitive to changes in QoL associated with the initiation of suppression treatment for recurrent genital herpes (RGH). It is concluded that the RGHQoL is a valuable instrument for inclusion in clinical trials. The psychometric properties of the instrument are such that it may also be used to monitor the progress of individual patients.
Abstract:
Executive Summary: In Nepal, landslides are one of the major natural hazards after epidemics, killing over 100 people per year. This figure, however, underreports the actual impact that landslides have on livelihoods and food security in rural Nepal. With predictions of more intense rainfall patterns, landslide occurrence in the Himalayas is likely to increase and to remain one of the major impediments to development. Because many localities are remote and resources are scarce, responsibility for disaster preparedness and response in mountain areas usually lies with the communities themselves. Everyday life is full of risk in the mountains of Nepal, which is why mountain populations, like other populations living in harsh conditions, have developed a number of coping strategies for dealing with adverse situations. Perhaps because landslides in Nepal are dispersed and remote, there have been few studies on the vulnerability and the coping and mitigation strategies of landslide-affected populations. There are also few recommendations available to guide authorities and populations on how to reduce losses due to landslides in Nepal, and even fewer on how to operationalize resilience and vulnerability. Many policy makers, international donors, NGOs and national authorities are currently asking what investments are needed to increase the so-called 'resilience' of mountain populations to climate risks. However, mountain populations are already quite resilient to seasonal fluctuations, temperature variations, rainfall patterns and market prices. In spite of their resilience, they continue to live in places at risk due to high vulnerability caused by structural inequalities: access to land, resources, markets and education. This interdisciplinary thesis examines the concept of resilience by questioning its usefulness and validity as the current goal of international development and disaster risk reduction policies, its conceptual limitations and its possible scope of action. The goal of this study is twofold: to better define and distinguish the factors and relationships between resilience, vulnerability, capacities and risk; and to test and improve a participatory methodology for evaluating landslide risk that can serve as a guidance tool for improving community-based disaster risk reduction. The objective is to develop a simple methodology that can be used by NGOs, local authorities and communities to reduce losses from landslides. Through six case studies in Central-Eastern Nepal, this study explores the relation between resilience, vulnerability and landslide risk using interdisciplinary methods, including geological assessments of landslides, semi-structured interviews, focus groups and participatory risk mapping. For comparison, the study sites were chosen in the Tehrathum, Sunsari and Dolakha Districts of Central/Eastern Nepal to reflect a variety of landslide types, from chronic to acute, and a variety of communities, from very marginalized to very high status. The study uses the Sustainable Livelihoods Approach as its conceptual basis, founded on the notion that access and rights to resources (natural, human/institutional, economic, environmental, physical) are the basis for coping with adversity such as landslides.
The study is also intended as a contribution to the growing literature and practice on Community-Based Disaster Risk Reduction specifically adapted to landslide-prone areas. In addition to the six case studies, results include an indicator-based methodology for assessing and measuring vulnerability and resilience, a composite risk assessment methodology, a typology of coping strategies and risk perceptions, and a thorough analysis of the relation between risk, vulnerability and resilience. The methodology for assessing vulnerability, resilience and risk is relatively cost-effective and replicable in a low-data environment. Perhaps the major finding is that resilience is a process that defines a community's (or system's) capacity to rebound following adversity, but it does not necessarily reduce vulnerability or risk, which requires addressing more structural issues related to poverty. The conclusions therefore include a critical view of resilience as a main goal of international development and disaster risk reduction policies: it is a useful concept in the context of recovery after a disaster, but it needs to be addressed in parallel with vulnerability and risk. This research was funded by an interdisciplinary grant (#26083591) from the Swiss National Science Foundation for the period 2009-2011 and a seed grant from the Faculty of Geosciences and Environment at the University of Lausanne in 2008.
Abstract:
OBJECTIVE: The aim of this study was to evaluate a French-language version of the Adolescent Drug Abuse Diagnosis (ADAD) instrument in a Swiss sample of adolescent illicit drug and/or alcohol users. PARTICIPANTS AND SETTING: The participants were 102 French-speaking adolescents aged 13-19 years who met the criteria for illicit drug or alcohol use (at least one substance, excluding tobacco, at least once a week during the last 3 months). They were recruited in hospitals, institutions and leisure venues. PROCEDURE: The ADAD was administered individually by trained psychologists. It was integrated into a broader protocol including DSM-IV alcohol and drug abuse diagnoses, the BDI-13 (Beck Depression Inventory), life events and treatment trajectories. RESULTS: The ADAD showed good inter-rater reliability; the subscales showed good internal coherence, and the correlations between the composite scores and the severity ratings were moderate to high. Finally, the results confirmed good concurrent validity for three of the eight ADAD dimensions. CONCLUSIONS: The French-language version of the ADAD appears to be an adequate instrument for assessing drug use and associated problems in adolescents. Despite its complexity, the instrument meets acceptable validity, reliability and usefulness criteria, enabling international and transcultural comparisons.
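The abstract does not state which inter-rater statistic was used; a common choice for categorical ratings is Cohen's kappa, which corrects raw agreement for chance. A minimal sketch with hypothetical ratings, offered only as a reference for the concept:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: (observed agreement - chance agreement) / (1 - chance)."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    chance = sum(counts_a[c] * counts_b[c] for c in counts_a) / n ** 2
    return (observed - chance) / (1 - chance)

# Hypothetical severity ratings (0-3) of the same 10 cases by two raters.
a = [0, 1, 2, 2, 3, 1, 0, 2, 3, 1]
b = [0, 1, 2, 1, 3, 1, 0, 2, 3, 2]
print(f"kappa = {cohens_kappa(a, b):.2f}")  # 0.73: substantial agreement
```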
Abstract:
Functionally relevant large-scale brain dynamics operate within the framework imposed by anatomical connectivity and the time delays due to finite transmission speeds. To gain insight into the reliability and comparability of large-scale brain network simulations, we investigate the effects of variations in the anatomical connectivity. Two different sets of detailed global connectivity structures are explored: the first extracted from the CoCoMac database and rescaled to the spatial extent of the human brain, the second derived from white-matter tractography applied to diffusion spectrum imaging (DSI) of a human subject. We use a combination of graph-theoretical measures of the connection matrices and numerical simulations to explicate the importance of both connectivity strength and delays in shaping dynamic behaviour. Our results demonstrate that the brain dynamics derived from the CoCoMac database are more complex and biologically more realistic than those based on the DSI database. We propose that the reason for this difference is the absence of directed weights in the DSI connectivity matrix.
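The graph-theoretical measures mentioned here operate on the connection matrix itself; a sketch of the general approach using networkx, with a random toy matrix standing in for the CoCoMac or DSI data (a directed analysis, as for CoCoMac, would use nx.DiGraph instead):

```python
import numpy as np
import networkx as nx

# Toy symmetric weighted connectivity matrix (8 regions), illustration only.
rng = np.random.default_rng(0)
W = rng.random((8, 8)) * (rng.random((8, 8)) < 0.4)  # sparse random weights
W = np.triu(W, 1)   # keep the upper triangle ...
W = W + W.T         # ... and mirror it so the matrix is symmetric

G = nx.from_numpy_array(W)  # undirected weighted graph
print("density:        ", round(nx.density(G), 3))
print("mean clustering:", round(nx.average_clustering(G, weight="weight"), 3))
print("degrees:        ", sorted(d for _, d in G.degree()))
```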
'Toxic' and 'Nontoxic': confirming critical terminology concepts and context for clear communication
Abstract:
If 'the dose makes the poison', and if the context of an exposure to a hazard shapes the risk as much as the innate character of the hazard itself, then what is 'toxic' and what is 'nontoxic'? This article is intended to help readers and communicators: anticipate that concepts such as 'toxic' and 'nontoxic' may have different meanings to different stakeholders in different contexts of general use, commerce, science, and the law; recognize specific situations in which terms and related information could potentially be misperceived or misinterpreted; evaluate the relevance, reliability, and other attributes of information for a given situation; control actions, assumptions, interpretations, conclusions, and decisions to avoid flaws and achieve a desired outcome; and confirm that the desired outcome has been achieved. To meet those objectives, we provide: examples of differing toxicology terminology concepts and contexts; a comprehensive decision-making framework for understanding and managing risk; a communication and education message- and audience-planning matrix to support the involvement of all relevant stakeholders; a set of CLEAR-communication assessment criteria for use by both readers and communicators; examples of flaws in decision-making; a suite of three tools to assign relevance vs reliability, align know vs show, and refine perception vs reality aspects of information; and four steps to foster effective community involvement and support. The framework and supporting process are generally applicable to meeting any objective.
Abstract:
Skeletal muscle mitochondrial (Mito) and lipid droplet (Lipid) content are often measured in human translational studies. Stereological point counting allows computing Mito and Lipid volume density (Vd) from micrographs taken with transmission electron microscopes. Former studies are not specific as to the size of the individual squares that make up the grids, making reproducibility difficult, particularly when different magnifications are used. Our objective was to determine which grid size would best predict fractional volume efficiently without sacrificing reliability, and to test a novel method to reduce sampling bias. METHODS: Ten subjects underwent vastus lateralis biopsies. Samples were fixed, embedded, and cut longitudinally in ultrathin sections of 60 nm. Twenty micrographs from the intramyofibrillar region were taken per subject at ×33,000 magnification. Different grid sizes were superimposed on each micrograph: 1,000 × 1,000 nm, 500 × 500 nm, and 250 × 250 nm. RESULTS: Mean Mito and Lipid Vd were not statistically different across grids. Variability was greater when going from the 1,000 × 1,000 nm to the 500 × 500 nm grid than from the 500 × 500 nm to the 250 × 250 nm grid. DISCUSSION: This study is the first to attempt to standardize grid size while keeping with conventional stereology principles, in the hope of producing replicable assessments that can be obtained universally across different studies of human skeletal muscle mitochondrial and lipid droplet content.
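The stereological principle behind these measurements is that volume density can be estimated as the fraction of regularly spaced test points that hit the structure of interest. A minimal sketch under that assumption, using a synthetic binary mask in place of real micrographs and pixel units in place of the nm grid sizes above:

```python
import numpy as np

def volume_density(mask: np.ndarray, step_px: int) -> float:
    """Point-counting estimate of volume density (Vd).

    mask    -- binary image, True where the structure (e.g., Mito) is present
    step_px -- grid spacing in pixels; smaller spacing = more test points
    """
    points = mask[step_px // 2::step_px, step_px // 2::step_px]
    return float(points.mean())  # hits / total test points

# Synthetic 1000x1000 px "micrograph" with ~15% of its area as structure.
rng = np.random.default_rng(1)
mask = rng.random((1000, 1000)) < 0.15
for step in (250, 125, 62):  # coarse to fine grids
    print(f"grid step {step:>3} px: Vd = {volume_density(mask, step):.3f}")
```

Finer grids add test points, which reduces the variance of the estimate at the cost of more counting: exactly the efficiency/reliability trade-off the study examines.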
Abstract:
This article introduces the Dyadic Coping Inventory (DCI; Bodenmann, 2008) and aims (1) to investigate the reliability and aspects of the validity of the Italian and French versions of the DCI, and (2) to replicate its factor structure and reliability using a new Swiss German sample. Based on 216 German-, 378 Italian-, and 198 French-speaking participants, the factor structure of the original German inventory could be replicated using principal components analysis in all three groups after excluding two items from the Italian and French versions. The latter versions proved as reliable as the German one, with the exception of low reliability for negative dyadic coping in the French group. Confirmatory factor analyses provided additional support for delegated dyadic coping and evaluation of dyadic coping. Intercorrelations among scales were similar across all three language groups, with a few exceptions. Previous findings were replicated in all three groups, showing that aspects of dyadic coping are more strongly related to marital quality than to dyadic communication. The use of the dyadic coping scales in the actor-partner interdependence model, the common fate model, and the mutual influence model is discussed.
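The factor-structure replication described here relies on principal components analysis; a sketch of the core step with scikit-learn, on simulated item data (the actual DCI items and any rotations used in the study are not reproduced):

```python
import numpy as np
from sklearn.decomposition import PCA

# Simulated questionnaire data: 200 participants x 10 items driven by
# two latent factors plus noise (illustration only).
rng = np.random.default_rng(2)
latent = rng.normal(size=(200, 2))
loadings = rng.normal(size=(2, 10))
items = latent @ loadings + rng.normal(scale=0.5, size=(200, 10))

pca = PCA().fit(items)
# The explained-variance profile suggests how many components to retain;
# here the first two dominate, mirroring the two planted factors.
print(np.round(pca.explained_variance_ratio_[:4], 2))
```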
Abstract:
PURPOSE OF REVIEW: This article reviews recent significant advances and current applications of the temporoparietal fascia flap (TPFF) in head and neck surgery. RECENT FINDINGS: The recent literature describes a wide range of new applications of the TPFF. Significant developments and refinements are reported in the reconstruction of orbitomaxillary composite defects and orbital exenteration cavities. The TPFF combined with an alloplastic framework is gaining importance in external ear reconstruction. Innovative prefabricated skin or soft-tissue grafts based on the TPFF are used to restore facial contour or to reconstruct complex facial defects. The free TPFF has found a role in laryngotracheal reconstruction as a vascular carrier to support cartilage grafts. SUMMARY: Owing to its reliability and unequalled structural properties, the TPFF still plays a central role in facial reconstruction. Future investigations will likely incorporate the free TPFF as a vascular carrier for bioengineered tissues, such as cartilage and mucosa, for various head and neck indications.
Abstract:
Multi-center studies using magnetic resonance imaging facilitate the study of small effect sizes, global population variance and rare diseases. The reliability and sensitivity of these multi-center studies crucially depend on the comparability of the data generated at different sites and time points. The level of inter-site comparability is still controversial for conventional anatomical T1-weighted MRI data. Quantitative multi-parameter mapping (MPM) was designed to provide MR parameter measures that are comparable across sites and time points, i.e., 1 mm high-resolution maps of the longitudinal relaxation rate (R1 = 1/T1), effective proton density (PD*), magnetization transfer saturation (MT) and effective transverse relaxation rate (R2* = 1/T2*). MPM was validated at 3T for use in multi-center studies by scanning five volunteers at three different sites. We determined the inter-site bias as well as the inter-site and intra-site coefficients of variation (CoV) for typical morphometric measures [i.e., gray matter (GM) probability maps used in voxel-based morphometry] and for the four quantitative parameters. The inter-site bias and CoV were smaller than 3.1% and 8%, respectively, except for the inter-site CoV of R2* (<20%). The GM probability maps based on the MT parameter maps had a 14% higher inter-site reproducibility than maps based on conventional T1-weighted images. The low inter-site bias and variance in the parameters and derived GM probability maps confirm the high comparability of the quantitative maps across sites and time points. Its reliability, short acquisition time, high resolution and the detailed insights it provides into brain microstructure make MPM an efficient tool for multi-center imaging studies.
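For reference, the comparability statistics used above reduce to simple formulas; a minimal sketch with hypothetical parameter values (the paper's exact computation may differ in detail):

```python
import numpy as np

# Hypothetical quantitative parameter estimates: 5 volunteers x 3 sites.
values = np.array([
    [0.61, 0.63, 0.62],
    [0.58, 0.59, 0.60],
    [0.65, 0.66, 0.64],
    [0.60, 0.62, 0.61],
    [0.63, 0.63, 0.65],
])

grand_mean = values.mean()
site_bias = 100 * (values.mean(axis=0) - grand_mean) / grand_mean  # % per site
# Inter-site CoV: per-subject SD across sites, averaged, relative to the mean.
inter_site_cov = 100 * values.std(axis=1, ddof=1).mean() / grand_mean
print("per-site bias (%): ", np.round(site_bias, 2))
print("inter-site CoV (%):", round(float(inter_site_cov), 2))
```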