257 results for Quantitative stability
Abstract:
BACKGROUND: PCR has the potential to detect and precisely quantify specific DNA sequences, but it is not yet often used as a fully quantitative method. A number of data collection and processing strategies have been described for the implementation of quantitative PCR. However, they can be experimentally cumbersome, their relative performances have not been evaluated systematically, and they often remain poorly validated statistically and/or experimentally. In this study, we evaluated the performance of known methods and compared them with newly developed data processing strategies in terms of resolution, precision and robustness. RESULTS: Our results indicate that simple methods that do not rely on the estimation of the efficiency of the PCR amplification may provide reproducible and sensitive data, but that they do not quantify DNA with precision. Other evaluated methods based on sigmoidal or exponential curve fitting were generally of both poor resolution and precision. A statistical analysis of the parameters that influence efficiency indicated that it depends mostly on the selected amplicon and to a lesser extent on the particular biological sample analyzed. Thus, we devised various strategies based on individual or averaged efficiency values, which were used to assess the regulated expression of several genes in response to a growth factor. CONCLUSION: Overall, qPCR data analysis methods differ significantly in their performance, and this analysis identifies methods that provide DNA quantification estimates of high precision, robustness and reliability. These methods allow reliable estimation of relative expression ratios of two-fold or higher, and our analysis provides an estimate of the number of biological samples that have to be analyzed to achieve a given precision.
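As a concrete illustration of the efficiency-based strategies discussed above, the sketch below computes an efficiency-corrected relative expression ratio in the style popularized by Pfaffl; the function name, efficiencies and Ct shifts are invented for the example and are not the study's own data or pipeline.

```python
# Illustrative sketch of efficiency-corrected relative quantification,
# in the spirit of the efficiency-based strategies discussed above.
# The formula layout and all numbers below are generic examples, not the
# specific method or data of the study.

def relative_expression_ratio(e_target, dct_target, e_ref, dct_ref):
    """Relative expression ratio of a target gene vs. a reference gene.

    e_*   : PCR amplification efficiency (2.0 = perfect doubling per cycle)
    dct_* : Ct(control sample) - Ct(treated sample) for that gene
    """
    return (e_target ** dct_target) / (e_ref ** dct_ref)

# Example: target amplifies with efficiency 1.95 and shifts by 2.1 cycles,
# reference with efficiency 1.90 and shifts by 0.2 cycles.
ratio = relative_expression_ratio(1.95, 2.1, 1.90, 0.2)
print(f"relative expression ratio: {ratio:.2f}")  # roughly 3.6-fold up-regulation
```

With both efficiencies fixed at 2.0, this reduces to the classic delta-delta-Ct calculation.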
Abstract:
Résumé: Previous developments at the Institute of Geophysics in Lausanne led to seismic acquisition techniques and to the interpretation of 2D and 3D seismic data for studying the geology of the region, in particular the different sedimentary sequences of Lake Geneva. To allow a quantitative interpretation of the seismic data by determining physical parameters of the sediments, the AVO (Amplitude Versus Offset) method was applied. Two lacustrine seismic surveys, 2D and 3D, were acquired in order to test the AVO method in the Grand Lac on the river deltas. The acquisition geometry was redesigned so that data could be recorded at large offsets. The seismic streamers, deployed end to end, made it possible to reach incidence angles of about 40°. GPS receivers specially developed for this purpose and placed along the streamer allowed the position of the streamer to be determined with a precision of ±0.5 m after post-processing of the data. The calibration of our hydrophones, carried out in an anechoic chamber, provided their amplitude response as a function of frequency. A maximum variation of 10 dB was found between the streamer sensors and the reference signal. Amplitude-preserving seismic processing was applied to the lake data. A surface-consistent algorithm was used to correct the amplitude variations of the air-gun shots. The intercept and gradient sections obtained on the Aubonne and Dranse deltas were used to produce cross-plots. This representation makes it possible to classify amplitude anomalies according to sediment type and potential gas content. One of the attributes that can be extracted from 3D data is the amplitude of the reflectivity of a seismic interface, which adds a quantitative component to the geological interpretation of an interface. The water bottom on the Aubonne delta shows amplitude anomalies that characterize the channels. Inversion of the Zoeppritz equation with the Levenberg-Marquardt algorithm was programmed in order to extract the physical parameters of the sediments on this delta. A statistical study of the inversion results makes it possible to simulate the variation of amplitude with offset. The resulting model has water as its first layer and, as its second, a layer with VP = 1461 m/s, ρ = 1182 kg/m³ and VS = 383 m/s.
Abstract: A system to record very high resolution (VHR) seismic data on lakes in 2D and 3D was developed at the Institute of Geophysics, University of Lausanne. Several seismic surveys carried out on Lake Geneva helped us to better understand the geology of the area and to identify sedimentary sequences. However, more sophisticated analysis of the data, such as the AVO (Amplitude Versus Offset) method, provides a means of deciphering the detailed structure of the complex Quaternary sedimentary fill of the Lake Geneva trough. To study the physical parameters of the sediments, we applied the AVO method at selected locations: the Aubonne and Dranse River deltas, where the configuration of the strata is relatively smooth and the discontinuities between them are easy to pick. A specific layout was developed to acquire data at large incidence angles. 2D and 3D seismic data were acquired with streamers, deployed end to end, providing incidence angles of up to 40°.
One or more GPS antennas attached to the streamer enabled us to calculate individual hydrophone positions with an accuracy of 50 cm after post-processing of the navigation data. To ensure that our system provides correct amplitude information, our streamer sensors were calibrated in an anechoic chamber using a loudspeaker as a source. Amplitude variations between hydrophones were of the order of 10 dB. An amplitude correction for each hydrophone was computed and applied before processing. Amplitude-preserving processing was then carried out. Intercept vs. gradient cross-plots enabled us to determine that both geological discontinuities (lacustrine sediments/moraine and moraine/molasse) have well-defined trends. A 3D volume collected on the Aubonne river delta was processed in order to obtain AVO attributes. Amplitude maps were produced for quantitative interpretation and revealed high reflectivity in the channels. Inversion of the Zoeppritz equation at the water bottom using the Levenberg-Marquardt algorithm was carried out to estimate VP, VS and ρ of the sediments immediately under the lake bottom. Real-data inversion gave, under the water layer, a mud layer with VP = 1461 m/s, ρ = 1182 kg/m³ and VS = 383 m/s.
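For readers unfamiliar with intercept/gradient analysis, the following minimal sketch fits the two-term Shuey approximation of the Zoeppritz reflection coefficient to synthetic amplitude-versus-angle picks with a Levenberg-Marquardt least-squares routine; the angles and amplitudes are invented, and the thesis's actual processing and inversion are more elaborate.

```python
# Minimal sketch of an AVO intercept/gradient fit using the two-term
# Shuey approximation R(theta) ~ A + B*sin^2(theta); a generic
# illustration, not the processing flow used in the thesis.
import numpy as np
from scipy.optimize import curve_fit

def shuey(theta_deg, intercept, gradient):
    """Two-term Shuey approximation of the Zoeppritz reflection coefficient."""
    s = np.sin(np.radians(theta_deg)) ** 2
    return intercept + gradient * s

# Hypothetical picked amplitudes at increasing incidence angles (up to ~40 deg).
angles = np.array([5, 10, 15, 20, 25, 30, 35, 40], dtype=float)
amps = np.array([0.210, 0.205, 0.196, 0.183, 0.168, 0.150, 0.131, 0.110])

# Levenberg-Marquardt least-squares fit (curve_fit defaults to 'lm' without bounds).
(A, B), _ = curve_fit(shuey, angles, amps)
print(f"intercept A = {A:.3f}, gradient B = {B:.3f}")
# A and B estimated over many gathers would then populate an intercept-gradient cross-plot.
```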
Abstract:
Agricultural practices, such as spreading liquid manure or the utilisation of land as animal pasture, can result in faecal contamination of water resources. Rhodococcus coprophilus is used in microbial source tracking to indicate animal faecal contamination in water. Methods previously described for detecting R. coprophilus in water were neither sensitive nor specific. Therefore, the aim of this study was to design and validate a new quantitative polymerase chain reaction (qPCR) assay to improve the detection of R. coprophilus in water. The new PCR assay was based on the R. coprophilus 16S rRNA gene. Validation showed that the new approach was specific and sensitive for deoxyribonucleic acid (DNA) from the target host species. Compared with other PCR assays tested in this study, the detection limit of the new qPCR was between 1 and 3 log units lower. The method, including a filtration step, was further validated and successfully used in a field investigation in Switzerland. Our work demonstrates that the new detection method is sensitive and robust for detecting R. coprophilus in surface and spring water. Compared with PCR assays available in the literature and with the culture-dependent method, the new molecular approach improves the detection of R. coprophilus.
Abstract:
For the detection and management of osteoporosis and osteoporosis-related fractures, quantitative ultrasound (QUS) is emerging as a relatively low-cost and readily accessible alternative to dual-energy X-ray absorptiometry (DXA) measurement of bone mineral density (BMD) in certain circumstances. The following is a brief but thorough review of the existing literature on the use of QUS in 6 settings: 1) assessing fragility fracture risk; 2) diagnosing osteoporosis; 3) initiating osteoporosis treatment; 4) monitoring osteoporosis treatment; 5) osteoporosis case finding; and 6) quality assurance and control. Many QUS devices exist that differ considerably in the parameters they measure and in the strength of the empirical evidence supporting their use. In general, heel QUS appears to be the most tested and most effective. Overall, some, but not all, heel QUS devices are effective in assessing fracture risk in some, but not all, populations; the evidence is strongest for Caucasian females over 55 years old. Otherwise, the evidence is fair with respect to certain devices allowing accurate diagnosis of the likelihood of osteoporosis, and generally fair to poor for the use of QUS when initiating or monitoring osteoporosis treatment. A reasonable protocol is proposed herein for case-finding purposes, which relies on a combined assessment of clinical risk factors (CRF) and heel QUS. Finally, several recommendations are made for quality assurance and control.
Abstract:
PPARs are members of the nuclear hormone receptor superfamily and are primarily involved in lipid metabolism. The expression patterns of all 3 PPAR isotypes in 22 adult rat organs were analyzed by a quantitative ribonuclease protection assay. The data obtained allowed comparison of the expression of each isotype to the others and provided new insight into the less studied PPAR beta (NR1C2) expression and function. This isotype shows a ubiquitous expression pattern and is the most abundant of the three PPARs in all analyzed tissues except adipose tissue. Its expression is especially high in the digestive tract, in addition to kidney, heart, diaphragm, and esophagus. After an overnight fast, PPAR beta mRNA levels are dramatically down-regulated in liver and kidney by up to 80% and are rapidly restored to control levels upon refeeding. This tight nutritional regulation is independent of the circulating glucocorticoid levels and the presence of PPAR alpha, whose activity is markedly up-regulated in the liver and small intestine during fasting. Finally, PPAR gamma 2 mRNA levels are decreased by 50% during fasting in both white and brown adipose tissue. In conclusion, fasting can strongly influence PPAR expression, but in only a few selected tissues.
Abstract:
OBJECTIVE: The aim of the current study was to investigate the biomechanical stability and fixation strength provided by a posterior approach reconstruction technique to realign the craniovertebral junction. METHODS: We tested seven human cadaver occipito-cervical spines (occiput-C4) by applying pure moments of ±1.5 Nm on a spine tester. Each specimen was tested in the following modes: 1) intact; 2) injured; 3) spacers alone at the C1-C2 articulation (S); 4) spacers plus C1-C2 posterior instrumentation (S+PI); and 5) spacers plus C1-C2 posterior instrumentation plus midline wiring (S+PI+MLW). C1-C2 range of motion for each construct was obtained in flexion-extension, lateral bending, and axial rotation. RESULTS: In all loading modes, the S, S+PI, and S+PI+MLW constructs significantly reduced range of motion compared with the intact and injured conditions (P < 0.05). There was no statistical difference between any of the three instrumentation constructs (P > 0.05). CONCLUSIONS: This study investigated the biomechanics of the posterior approach technique for realignment of the craniovertebral junction and compared it with additional posterior fixations. The stand-alone spacers were stable in all three loading modes. Posterior instrumentation increased stability compared with stand-alone spacers, and a third point of fixation using midline wiring increased stability further, although the difference with and without midline wiring was small and not statistically significant. The present study highlights the biomechanics of this novel concept and reaffirms the view that distraction of the C1-C2 articular facets and direct atlantoaxial articular joint fixation would be an ideal method of management of basilar invagination.
Abstract:
Quantitative approaches in ceramology are gaining ground in excavation reports, archaeological publications and thematic studies. Hence, a wide variety of methods are being used depending on the researchers' theoretical premises, the type of material examined, the context of discovery and the questions addressed. The round table that took place in Athens in November 2008 was intended to offer the participants the opportunity to present a selection of case studies on the basis of which methodological approaches were discussed. The aim was to define a set of guidelines for quantification that would prove to be of use to all researchers. Contents: 1) Introduction (Samuel Verdan); 2) Isthmia and beyond. How can quantification help the analysis of EIA sanctuary deposits? (Catherine Morgan); 3) Approaching aspects of cult practice and ethnicity in Early Iron Age Ephesos using quantitative analysis of a Protogeometric deposit from the Artemision (Michael Kerschner); 4) Development of a ceramic cultic assemblage: Analyzing pottery from Late Helladic IIIC through Late Geometric Kalapodi (Ivonne Kaiser, Laura-Concetta Rizzotto, Sara Strack); 5) 'Erfahrungsbericht' of application of different quantitative methods at Kalapodi (Sara Strack); 6) The Early Iron Age sanctuary at Olympia: counting sherds from the Pelopion excavations (1987-1996) (Birgitta Eder); 7) L'aire du pilier des Rhodiens à Delphes: Essai de quantification du mobilier (Jean-Marc Luce); 8) A new approach in ceramic statistical analyses: Pit 13 on Xeropolis at Lefkandi (David A. Mitchell, Irene S. Lemos); 9) Households and workshops at Early Iron Age Oropos: A quantitative approach of the fine, wheel-made pottery (Vicky Vlachou); 10) Counting sherds at Sindos: Pottery consumption and construction of identities in the Iron Age (Stefanos Gimatzidis); 11) Analyse quantitative du mobilier céramique des fouilles de Xombourgo à Ténos et le cas des supports de caisson (Jean-Sébastien Gros); 12) Defining a typology of pottery from Gortyn: The material from a pottery workshop pit (Emanuela Santaniello); 13) Quantification of ceramics from Early Iron Age tombs (Antonis Kotsonas); 14) Quantitative analysis of the pottery from the Early Iron Age necropolis of Tsikalario on Naxos (Xenia Charalambidou); 15) Finding the Early Iron Age in field survey: Two case studies from Boeotia and Magnesia (Vladimir Stissi); 16) Pottery quantification: Some guidelines (Samuel Verdan).
Abstract:
Multi-center studies using magnetic resonance imaging facilitate studying small effect sizes, global population variance and rare diseases. The reliability and sensitivity of these multi-center studies crucially depend on the comparability of the data generated at different sites and time points. The level of inter-site comparability is still controversial for conventional anatomical T1-weighted MRI data. Quantitative multi-parameter mapping (MPM) was designed to provide MR parameter measures that are comparable across sites and time points, i.e., 1 mm high-resolution maps of the longitudinal relaxation rate (R1 = 1/T1), effective proton density (PD*), magnetization transfer saturation (MT) and effective transverse relaxation rate (R2* = 1/T2*). MPM was validated at 3T for use in multi-center studies by scanning five volunteers at three different sites. We determined the inter-site bias and the inter-site and intra-site coefficients of variation (CoV) for typical morphometric measures [i.e., gray matter (GM) probability maps used in voxel-based morphometry] and the four quantitative parameters. The inter-site bias and CoV were smaller than 3.1% and 8%, respectively, except for the inter-site CoV of R2* (<20%). The GM probability maps based on the MT parameter maps had a 14% higher inter-site reproducibility than maps based on conventional T1-weighted images. The low inter-site bias and variance in the parameters and derived GM probability maps confirm the high comparability of the quantitative maps across sites and time points. The reliability, short acquisition time, high resolution and detailed insights into brain microstructure provided by MPM make it an efficient tool for multi-center imaging studies.
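A simplified sketch of how inter-site bias and coefficients of variation of the kind reported above can be computed is given below; the array layout (volunteers by sites) and the R1 values are illustrative assumptions rather than the study's data or pipeline.

```python
# Simplified sketch of inter-site bias and coefficient of variation (CoV)
# for a quantitative MR parameter; the numbers are invented for illustration.
import numpy as np

# Hypothetical mean R1 (1/s) in a region of interest:
# rows = 5 volunteers, columns = 3 sites.
r1 = np.array([
    [0.62, 0.64, 0.63],
    [0.58, 0.59, 0.60],
    [0.65, 0.66, 0.64],
    [0.60, 0.61, 0.62],
    [0.63, 0.62, 0.63],
])

grand_mean = r1.mean()
site_means = r1.mean(axis=0)

# Inter-site bias: deviation of each site's mean from the grand mean (%).
inter_site_bias = 100 * (site_means - grand_mean) / grand_mean

# Inter-site CoV: variability across sites within each volunteer, averaged (%).
inter_site_cov = 100 * (r1.std(axis=1, ddof=1) / r1.mean(axis=1)).mean()

print("inter-site bias per site (%):", np.round(inter_site_bias, 2))
print(f"inter-site CoV (%): {inter_site_cov:.2f}")
```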
Abstract:
Using a large prospective cohort of over 12,000 women, we determined two thresholds (high risk and low risk of hip fracture) for use in a 10-yr hip fracture probability model that we had previously described, a model combining the heel stiffness index measured by quantitative ultrasound (QUS) with a set of easily determined clinical risk factors (CRFs). The model identified a higher percentage of women with fractures as high risk than a previously reported risk score that combined QUS and CRFs. In addition, it categorized women in a way that was quite consistent with the categorization obtained using dual X-ray absorptiometry (DXA) and the World Health Organization (WHO) classification system; the two methods identified similar percentages of women with and without fractures in each of their three categories, but only partly identified the same women. Nevertheless, combining our composite probability model with DXA in a case-finding strategy will likely further improve the detection of women at high risk of fragility hip fracture. We conclude that the currently proposed model may be of some use as an alternative to the WHO classification criteria for osteoporosis, at least when access to DXA is limited.
Abstract:
We present the application of a real-time quantitative PCR assay, previously developed to measure relative telomere length in humans and mice, to two bird species, the zebra finch Taeniopygia guttata and the Alpine swift Apus melba. This technique is based on the PCR amplification of telomeric (TTAGGG)n sequences using specific oligonucleotide primers. Relative telomere length is expressed as the ratio (T/S) of telomere repeat copy number (T) to single-copy control gene copy number (S). This method is particularly useful for comparisons of individuals within species, or where the same individuals are followed longitudinally. We used glyceraldehyde-3-phosphate dehydrogenase (GAPDH) as the single-copy control gene. In both species, we validated our PCR measurements of relative telomere length against absolute measurements of telomere length determined by the conventional method of quantifying telomere terminal restriction fragment (TRF) lengths, using traditional Southern blot analysis (Alpine swifts) and in-gel hybridization (zebra finches). As found in humans and mice, telomere lengths in the same sample measured by TRF and PCR were well correlated in both the Alpine swift and the zebra finch. Hence, this PCR assay for measuring bird telomeres, which is fast and requires only small amounts of genomic DNA, should open new avenues in the study of environmental factors influencing variation in telomere length, and of how this variation translates into variation in cellular and whole-organism senescence.
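The T/S calculation described above can be illustrated with a short sketch; it assumes roughly 100% amplification efficiency and uses invented Ct values, so it shows the general idea rather than the exact formula validated in the paper.

```python
# Illustrative calculation of a relative telomere length (T/S) ratio from
# qPCR cycle-threshold values, assuming ~100% amplification efficiency.
# Sample Ct values below are invented for the example.

def ts_ratio(ct_telomere, ct_single_copy, ct_telomere_ref, ct_single_copy_ref):
    """T/S ratio of a sample relative to a reference (calibrator) sample."""
    delta_ct_sample = ct_telomere - ct_single_copy        # sample Delta Ct
    delta_ct_ref = ct_telomere_ref - ct_single_copy_ref   # calibrator Delta Ct
    return 2.0 ** -(delta_ct_sample - delta_ct_ref)

# Example: a bird whose telomere Ct is lower (more telomeric repeats)
# than the calibrator's, with GAPDH as the single-copy control gene.
print(f"T/S = {ts_ratio(14.8, 21.3, 15.6, 21.2):.2f}")  # > 1: longer telomeres than the calibrator
```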
Abstract:
The mode of Na+ entry and the dynamics of intracellular Na+ concentration ([Na+]i) changes following application of the neurotransmitter glutamate were investigated in mouse cortical astrocytes in primary culture by video fluorescence microscopy. Glutamate evoked an elevation of [Na+]i whose amplitude and initial rate were concentration dependent. The glutamate-evoked Na+ increase was primarily due to Na+-glutamate cotransport, as inhibition of non-NMDA ionotropic receptors by 6-cyano-7-nitroquinoxaline-2,3-dione (CNQX) only weakly diminished the response, and D-aspartate, a substrate of the glutamate transporter, produced [Na+]i elevations similar to those evoked by glutamate. Non-NMDA receptor activation could nevertheless be demonstrated by preventing receptor desensitization using cyclothiazide. Thus, under normal conditions non-NMDA receptors do not contribute significantly to the glutamate-evoked Na+ response. The rate of Na+ influx decreased during glutamate application, with kinetics that correlate well with the increase in [Na+]i and that depend on the extracellular concentration of glutamate. A tight coupling between Na+ entry and Na+/K+ ATPase activity was revealed by the massive [Na+]i increase evoked by glutamate when pump activity was inhibited by ouabain. During prolonged glutamate application, [Na+]i remains elevated at a new steady state at which Na+ influx through the transporter matches Na+ extrusion through the Na+/K+ ATPase. A mathematical model of the dynamics of [Na+]i homeostasis is presented which precisely defines the critical role of Na+ influx kinetics in the establishment of the elevated steady state and its consequences for cellular bioenergetics. Indeed, extracellular glutamate concentrations of 10 microM already markedly increase the energetic demands of the astrocytes.
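The balance the model captures, Na+ influx through the glutamate transporter against extrusion by the Na+/K+ ATPase, can be sketched with a toy ordinary differential equation; the rate laws and parameter values below are illustrative guesses, not those of the published model.

```python
# Toy sketch of the [Na+]i balance described above: influx through the
# glutamate transporter (plus a background leak) against extrusion by the
# Na+/K+ ATPase. All rate laws and parameters are illustrative assumptions.

def influx(na_i, glu_uM):
    """Na+ influx (mM/s): constant leak plus transporter-mediated entry that
    saturates with glutamate and weakens as [Na+]i rises."""
    leak = 0.18
    transporter = 0.8 * (glu_uM / (glu_uM + 10.0)) * (1.0 - na_i / 60.0)
    return leak + transporter

def pump(na_i):
    """Na+/K+ ATPase extrusion (mM/s) with a Hill-type dependence on [Na+]i."""
    return 0.5 * na_i ** 3 / (na_i ** 3 + 12.0 ** 3)

def steady_state(glu_uM, na0=10.0, dt=0.01, t_end=600.0):
    """Forward-Euler integration of d[Na+]i/dt = influx - pump."""
    na = na0
    for _ in range(int(t_end / dt)):
        na += dt * (influx(na, glu_uM) - pump(na))
    return na

for glu in (0.0, 10.0, 100.0):
    print(f"glutamate {glu:5.1f} uM -> steady-state [Na+]i ~ {steady_state(glu):.1f} mM")
```

In this toy version, higher glutamate shifts the balance point so that [Na+]i settles at an elevated plateau where influx equals pump-mediated extrusion, the qualitative behaviour described in the abstract.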
Abstract:
One of the most important issues in molecular biology is to understand the regulatory mechanisms that control gene expression. Gene expression is often regulated by proteins called transcription factors, which bind to short (5 to 20 base pairs), degenerate segments of DNA. Experimental efforts towards understanding the sequence specificity of transcription factors are laborious and expensive, but can be substantially accelerated with the use of computational predictions. This thesis describes the use of algorithms and resources for transcription factor binding site analysis in addressing quantitative modelling, where probabilistic models are built to represent the binding properties of a transcription factor and can be used to find new functional binding sites in genomes. Initially, an open-access database (HTPSELEX) was created, holding high-quality binding sequences for two eukaryotic families of transcription factors, namely CTF/NF1 and LEF1/TCF. The binding sequences were elucidated using a recently described experimental procedure called HTP-SELEX that allows the generation of a large number (>1000) of binding sites using mass sequencing technology. For each HTP-SELEX experiment we also provide accurate primary experimental information about the protein material used, details of the wet-lab protocol, an archive of sequencing trace files, and assembled clone sequences of binding sequences. The database also offers reasonably large SELEX libraries obtained with conventional low-throughput protocols. The database is available at http://wwwisrec.isb-sib.ch/htpselex/ and ftp://ftp.isrec.isb-sib.ch/pub/databases/htpselex. The Expectation-Maximisation (EM) algorithm is one of the methods frequently used to estimate probabilistic models representing the sequence specificity of transcription factors. We present computer simulations to estimate the precision of EM-estimated models as a function of data set parameters (such as the length of the initial sequences, the number of initial sequences, and the percentage of non-binding sequences). We observed a remarkable robustness of the EM algorithm with regard to the length of the training sequences and the degree of contamination. The HTPSELEX database and the benchmarked results of the EM algorithm formed part of the foundation for the subsequent project, in which a hidden Markov model framework was developed to represent the sequence specificity of the transcription factors CTF/NF1 and LEF1/TCF using the HTP-SELEX data. The hidden Markov model framework is capable of both predicting and classifying CTF/NF1 and LEF1/TCF binding sites. A covariance analysis of the binding sites revealed non-independent base preferences at different nucleotide positions, providing insight into the binding mechanism. We next tested the LEF1/TCF model by computing binding scores for a set of LEF1/TCF binding sequences for which relative affinities had been determined experimentally using non-linear regression. The predicted and experimentally determined binding affinities correlated well.
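As a small illustration of the probabilistic-model idea underlying this work, the sketch below scores candidate sites with a position weight matrix (a log-likelihood ratio against a uniform background); the matrix and sequence are toy examples, and the thesis itself relies on EM-trained models and hidden Markov models rather than this simplified scoring.

```python
# Toy position-weight-matrix scoring of candidate binding sites.
# The matrix and sequence are invented; real models in the thesis are
# estimated by EM or represented as hidden Markov models.
import math

# Toy position frequency matrix for a 4-bp motif (keys: base, index: position).
pfm = {
    'A': [0.70, 0.05, 0.10, 0.80],
    'C': [0.10, 0.05, 0.70, 0.05],
    'G': [0.10, 0.85, 0.10, 0.05],
    'T': [0.10, 0.05, 0.10, 0.10],
}
background = 0.25  # uniform background probability per base

def pwm_score(site):
    """Sum of log2 likelihood ratios over the motif positions."""
    return sum(math.log2(pfm[base][i] / background) for i, base in enumerate(site))

def best_site(sequence, width=4):
    """Slide the motif over the sequence and return the best-scoring window."""
    windows = [sequence[i:i + width] for i in range(len(sequence) - width + 1)]
    return max(windows, key=pwm_score)

seq = "TTAGCAGGCATTT"
hit = best_site(seq)
print(f"best site: {hit}  score: {pwm_score(hit):.2f}")
```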
Abstract:
Molecular monitoring of BCR/ABL transcripts by real-time quantitative reverse transcription PCR (qRT-PCR) is an essential technique for the clinical management of patients with BCR/ABL-positive CML and ALL. Though quantitative BCR/ABL assays are performed in hundreds of laboratories worldwide, results among these laboratories cannot be reliably compared due to heterogeneity in test methods, data analysis, reporting, and the lack of quantitative standards. Recent efforts towards standardization have been limited in scope. Aliquots of RNA were sent to clinical test centers worldwide in order to evaluate methods and reporting for e1a2, b2a2, and b3a2 transcript levels using their own qRT-PCR assays. Total RNA was isolated from tissue culture cells that expressed each of the different BCR/ABL transcripts. Serial log dilutions, ranging from 10⁰ to 10⁻⁵, were prepared in RNA isolated from HL60 cells. Laboratories performed 5 independent qRT-PCR reactions for each sample type at each dilution. In addition, 15 qRT-PCR reactions of the 10⁻³ b3a2 RNA dilution were run to assess reproducibility within and between laboratories. Participants were asked to run the samples following their standard protocols and to report cycle threshold (Ct), quantitative values for BCR/ABL and housekeeping genes, and ratios of BCR/ABL to housekeeping genes for each sample RNA. Thirty-seven (n=37) participants submitted qRT-PCR results for analysis (36, 37, and 34 labs generated data for b2a2, b3a2, and e1a2, respectively). The limit of detection for this study was defined as the lowest dilution at which a Ct value could be detected for all 5 replicates. For b2a2, 15, 16, 4, and 1 lab(s) showed a limit of detection at the 10⁻⁵, 10⁻⁴, 10⁻³, and 10⁻² dilutions, respectively. For b3a2, 20, 13, and 4 labs showed a limit of detection at the 10⁻⁵, 10⁻⁴, and 10⁻³ dilutions, respectively. For e1a2, 10, 21, 2, and 1 lab(s) showed a limit of detection at the 10⁻⁵, 10⁻⁴, 10⁻³, and 10⁻² dilutions, respectively. Log %BCR/ABL ratio values provided a method for comparing results between the different laboratories for each BCR/ABL dilution series. Linear regression analysis revealed concordance among the majority of participant data over the 10⁻¹ to 10⁻⁴ dilutions. The overall slope values showed comparable results among the majority of b2a2 (mean = 0.939; median = 0.9627; range 0.399-1.1872), b3a2 (mean = 0.925; median = 0.922; range 0.625-1.140), and e1a2 (mean = 0.897; median = 0.909; range 0.5174-1.138) laboratory results (Fig. 1-3). Thirty-four (n=34) of the 37 laboratories reported Ct values for all 15 replicates, and only those with a complete data set were included in the inter-lab calculations. Eleven laboratories either did not report their copy number data or used other reporting units such as nanograms or cell numbers; therefore, only 26 laboratories were included in the overall analysis of copy numbers. The median copy number was 348.4, with a range from 15.6 to 547,000 copies (approximately a 4.5 log difference); the median intra-lab %CV was 19.2%, with a range from 4.2% to 82.6%. While our international performance evaluation using serially diluted RNA samples has reinforced the fact that heterogeneity exists among clinical laboratories, it has also demonstrated that performance within a laboratory is overall very consistent. Accordingly, the availability of defined BCR/ABL RNAs may facilitate the validation of all phases of quantitative BCR/ABL analysis and may be extremely useful as a tool for monitoring assay performance.
Ongoing analyses of these materials, along with the development of additional control materials, may solidify consensus around their application in routine laboratory testing and possible integration in worldwide efforts to standardize quantitative BCR/ABL testing.
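The per-laboratory slope analysis described above can be illustrated with a short sketch that regresses log10(%BCR/ABL) against log10(dilution) over the 10⁻¹ to 10⁻⁴ series; a slope near 1 indicates a linear, well-calibrated assay. The copy numbers below are invented for the example.

```python
# Sketch of the per-laboratory slope analysis: regress log10(%BCR/ABL ratio)
# against log10(dilution) over the 10^-1 to 10^-4 series.
# Copy numbers are hypothetical, not data from the study.
import numpy as np

dilutions = np.array([1e-1, 1e-2, 1e-3, 1e-4])
bcr_abl_copies = np.array([52000.0, 4800.0, 510.0, 47.0])      # hypothetical BCR/ABL copies
control_gene_copies = np.array([9.6e5, 9.9e5, 9.4e5, 9.7e5])   # hypothetical housekeeping-gene copies

ratio_percent = 100.0 * bcr_abl_copies / control_gene_copies
slope, intercept = np.polyfit(np.log10(dilutions), np.log10(ratio_percent), 1)
print(f"slope = {slope:.3f} (ideal ~1.0), intercept = {intercept:.3f}")
```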
Abstract:
Objectives: Quantitative ultrasound (QUS) is an attractive method for assessing fracture risk because it is portable, inexpensive, without ionizing radiation, and available in areas of the world where DXA is not readily accessible or affordable. However, the diversity of QUS scanners and the variability of fracture outcomes measured in different studies are important obstacles to widespread utilisation of QUS for fracture risk assessment. We aimed in this review to assess the predictive power of heel QUS for fractures, considering different characteristics of the association (QUS parameters and fracture outcomes measured, QUS devices, study populations, and independence from DXA-measured bone density). Materials/Methods: We conducted an inverse-variance random-effects meta-analysis of prospective studies with heel QUS measures at baseline and fracture outcomes in their follow-up. Relative risks (RR) per standard deviation (SD) of different QUS parameters (broadband ultrasound attenuation [BUA], speed of sound [SOS], stiffness index [SI], and quantitative ultrasound index [QUI]) for various fracture outcomes (hip, vertebral, any clinical, any osteoporotic, and major osteoporotic fractures) were reported based on the study questions. Results: 21 studies including 55,164 women and 13,742 men were included, with a total follow-up of 279,124 person-years. All four QUS parameters were associated with risk of different fractures. For instance, the RR of hip fracture per 1 SD decrease was 1.69 (95% CI 1.43-2.00) for BUA, 1.96 (95% CI 1.64-2.34) for SOS, 2.26 (95% CI 1.71-2.99) for SI, and 1.99 (95% CI 1.49-2.67) for QUI. Validated devices from different manufacturers predicted fracture risks with a similar performance (meta-regression p-values > 0.05 for differences between devices). There was no sign of publication bias among the studies. QUS measures predicted fracture with a similar performance in men and women. Meta-analysis of studies with QUS measures adjusted for hip DXA showed a significant and independent association with fracture risk (RR/SD for BUA = 1.34 [95% CI 1.22-1.49]). Conclusions: This study confirms that QUS of the heel using validated devices predicts the risk of different fracture outcomes in elderly men and women. Further research and international collaborations are needed for standardisation of QUS parameters across manufacturers and for inclusion of QUS in fracture risk assessment tools. Disclosure of Interest: None declared.
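A minimal sketch of the inverse-variance random-effects pooling named above (in the DerSimonian-Laird form) is given below; the per-study relative risks and standard errors are invented for illustration and do not reproduce the meta-analysis results.

```python
# Minimal sketch of an inverse-variance random-effects (DerSimonian-Laird)
# pooling of per-study relative risks per SD. Study estimates are invented.
import numpy as np

rr = np.array([1.55, 1.80, 1.62, 1.95, 1.70])          # hypothetical RR per SD decrease
se_log_rr = np.array([0.10, 0.12, 0.08, 0.15, 0.11])   # hypothetical SEs on the log scale

y = np.log(rr)
w_fixed = 1.0 / se_log_rr**2

# DerSimonian-Laird between-study variance (tau^2).
y_fixed = np.sum(w_fixed * y) / np.sum(w_fixed)
q = np.sum(w_fixed * (y - y_fixed) ** 2)
c = np.sum(w_fixed) - np.sum(w_fixed**2) / np.sum(w_fixed)
tau2 = max(0.0, (q - (len(y) - 1)) / c)

# Random-effects weights, pooled estimate and 95% CI.
w = 1.0 / (se_log_rr**2 + tau2)
pooled = np.sum(w * y) / np.sum(w)
se_pooled = np.sqrt(1.0 / np.sum(w))
lo, hi = np.exp(pooled - 1.96 * se_pooled), np.exp(pooled + 1.96 * se_pooled)
print(f"pooled RR/SD = {np.exp(pooled):.2f} (95% CI {lo:.2f}-{hi:.2f}), tau^2 = {tau2:.4f}")
```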