Abstract:
In dentistry the restoration of decayed teeth is challenging and makes great demands on both the dentist and the materials. Hence, fiber-reinforced posts have been introduced. The effects of different variables on the ultimate load of teeth restored using fiber-reinforced posts are controversial, possibly because the results are mostly based on non-standardized in vitro tests and are therefore inhomogeneous. This study combines the advantages of in vitro tests and finite element analysis (FEA) to clarify the effects of ferrule height, post length and the cementation technique used for restoration. Sixty-four single-rooted premolars were decoronated (ferrule height 1 or 2 mm), endodontically treated and restored using fiber posts (length 2 or 7 mm), composite fillings and metal crowns (resin-bonded or conventionally cemented). After thermocycling and chewing simulation, the samples were loaded until fracture, recording first damage events. Univariate analysis of variance (UNIANOVA) of the recorded fracture loads showed that ferrule height and cementation technique were significant: increased ferrule height and resin bonding of the crown resulted in higher fracture loads. Post length had no significant effect. All conventionally cemented crowns with a 1-mm ferrule height failed during artificial ageing, in contrast to resin-bonded crowns (75% survival rate). FEA confirmed these results and provided information about stress and force distribution within the restoration. Based on the findings of the in vitro tests and the computations, we conclude that crowns, especially those with a small ferrule height, should be resin bonded. Finally, centrally positioned fiber-reinforced posts did not contribute to load transfer as long as the bond between the tooth and the composite core was intact.
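The main effects reported above (higher fracture loads with greater ferrule height and with resin bonding) correspond, informally, to differences in per-level group means of the fracture loads. A minimal sketch of that grouping step follows; all loads and group labels are invented for illustration and are not the study's data, and a full UNIANOVA would additionally test the significance of these effects.

```python
# Hypothetical fracture-load data (in N) for the two significant factors:
# ferrule height (mm) and cementation technique. Numbers are invented.
from statistics import mean

samples = [
    # (ferrule_mm, cementation, fracture_load_N)
    (1, "conventional", 310), (1, "conventional", 295),
    (1, "resin",        420), (1, "resin",        445),
    (2, "conventional", 480), (2, "conventional", 455),
    (2, "resin",        610), (2, "resin",        585),
]

def factor_means(data, index):
    """Mean fracture load per level of one factor (its main effect)."""
    levels = {}
    for row in data:
        levels.setdefault(row[index], []).append(row[2])
    return {level: mean(loads) for level, loads in levels.items()}

ferrule_effect = factor_means(samples, 0)  # grouped by ferrule height
cement_effect = factor_means(samples, 1)   # grouped by cementation technique
```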
Abstract:
It is well known that early initiation of specific anti-infective therapy is crucial to reduce mortality in severe infection. Culture-based procedures for identifying pathogens are the diagnostic gold standard in such diseases; however, these methods yield results after 24 to 48 hours at the earliest. Severe infections such as sepsis must therefore be treated with empirical antimicrobial therapy, which is ineffective in an unknown fraction of these patients. Today's microbiological point-of-care tests are pathogen-specific and therefore not appropriate for an infection with a variety of possible pathogens. Molecular nucleic acid diagnostics such as the polymerase chain reaction (PCR) allow the identification of pathogens and resistance genes. These methods are used routinely to speed up the analysis of positive blood cultures. The newest PCR-based systems allow the parallel identification of the 25 most frequent sepsis pathogens, without prior culture, in less than 6 hours; thereby, these systems might shorten the period of possibly insufficient anti-infective therapy. However, these extensive tools are not suitable as point-of-care diagnostics. Miniaturization and automation of nucleic-acid-based methods are still pending, as is an increase in the number of pathogens and resistance genes detectable by these methods. It is assumed that molecular PCR techniques will have an increasing impact on microbiological diagnostics in the future.
Abstract:
Quantitative sensory tests are widely used in human research to evaluate the effect of analgesics and explore altered pain mechanisms, such as central sensitization. In order to apply these tests in clinical practice, knowledge of reference values is essential. The aim of this study was to determine the reference values of pain thresholds for mechanical and thermal stimuli, as well as withdrawal time for the cold pressor test in 300 pain-free subjects. Pain detection and pain tolerance thresholds to pressure, heat and cold were determined at three body sites: (1) lower back, (2) suprascapular region and (3) second toe (for pressure) or the lateral aspect of the leg (for heat and cold). The influences of gender, age, height, weight, body-mass index (BMI), body side of testing, depression, anxiety, catastrophizing and parameters of Short-Form 36 (SF-36) were analyzed by multiple regressions. Quantile regressions were performed to define the 5th, 10th and 25th percentiles as reference values for pain hypersensitivity and the 75th, 90th and 95th percentiles as reference values for pain hyposensitivity. Gender, age and/or the interaction of age with gender were the only variables that consistently affected the pain measures. Women were more pain sensitive than men. However, the influence of gender decreased with increasing age. In conclusion, normative values of parameters related to pressure, heat and cold pain stimuli were determined. Reference values have to be stratified by body region, gender and age. The determination of these reference values will now allow the clinical application of the tests for detecting abnormal pain reactions in individual patients.
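The percentile-based reference values described above can be sketched as follows. The threshold data are invented, and the real study derived its cut-offs with quantile regression stratified by body region, gender and age rather than raw sample percentiles.

```python
# Sketch: percentile reference values for pain thresholds.
# The 5th/10th/25th percentiles flag hypersensitivity, the
# 75th/90th/95th hyposensitivity. Data below are invented.
def percentile(values, p):
    """Linear-interpolation percentile (p in 0-100), as in common stats packages."""
    xs = sorted(values)
    k = (len(xs) - 1) * p / 100
    lo, hi = int(k), min(int(k) + 1, len(xs) - 1)
    return xs[lo] + (xs[hi] - xs[lo]) * (k - lo)

# Invented pressure-pain detection thresholds, e.g. in kg/cm^2:
thresholds = [2.1, 2.8, 3.0, 3.4, 3.9, 4.2, 4.8, 5.1, 5.6, 6.3]

reference = {p: percentile(thresholds, p) for p in (5, 10, 25, 75, 90, 95)}
# A patient's threshold below reference[5] would suggest hypersensitivity,
# above reference[95] hyposensitivity.
```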
Abstract:
Clinical manifestations of lactase (LCT) deficiency include intestinal and extra-intestinal symptoms. The lactose hydrogen breath test (H2-BT) is considered the gold standard for evaluating LCT deficiency (LD). Recently, the single-nucleotide polymorphism C/T(-13910) has been associated with LD. The objectives of the present study were to evaluate the agreement between genetic testing for LCT C/T(-13910) and the lactose H2-BT, and the diagnostic value of extended symptom assessment. Of the 201 patients included in the study, 194 with clinical suspicion of LD (139 females, mean age 38 years, range 17-79; 55 males, mean age 38 years, range 18-68) underwent a 3-4 h H2-BT and genetic testing for LCT C/T(-13910). Patients rated five intestinal and four extra-intestinal symptoms during the H2-BT and then at home for the following 48 h. With the H2-BT as the gold standard, the CC(-13910) genotype had a sensitivity of 97% and a specificity of 95%, with a κ of 0.9, in diagnosing LCT deficiency. Patients with LD had more intense intestinal symptoms 4 h after the lactose challenge included in the H2-BT. We found no difference in the intensity of extra-intestinal symptoms between patients with and without LD. Symptom assessment yielded differences for the intestinal symptoms abdominal pain, bloating, borborygmi and diarrhoea between 120 min and 4 h after the oral lactose challenge. Extra-intestinal symptoms (dizziness, headache and myalgia) and extension of symptom assessment up to 48 h did not consistently show different results. In conclusion, genetic testing shows excellent agreement with the standard lactose H2-BT and may replace breath testing for the diagnosis of LD. Extended symptom scores and assessment of extra-intestinal symptoms have limited diagnostic value in the evaluation of LD.
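The agreement statistics above (sensitivity, specificity, and a kappa-type agreement coefficient) all derive from a 2x2 table of genotype versus breath-test results. A minimal sketch, with invented counts rather than the study's data:

```python
# Sketch: diagnostic agreement of an index test against a gold standard,
# from a 2x2 table. tp/fp/fn/tn counts below are invented for illustration.
def diagnostic_agreement(tp, fp, fn, tn):
    """Return (sensitivity, specificity, Cohen's kappa)."""
    n = tp + fp + fn + tn
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    po = (tp + tn) / n  # observed agreement
    # chance agreement from the marginal totals:
    pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
    kappa = (po - pe) / (1 - pe)
    return sensitivity, specificity, kappa

# e.g. 90 true positives, 5 false positives, 10 false negatives, 95 true negatives:
sens, spec, kappa = diagnostic_agreement(90, 5, 10, 95)
```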
Abstract:
The best available test for the diagnosis of upper extremity deep venous thrombosis (UEDVT) is contrast venography. The aim of this systematic review was to assess whether the diagnostic accuracy of other tests for clinically suspected UEDVT is high enough to justify their use in clinical practice, and to evaluate whether any test can replace venography.
Abstract:
The use of antibiotics is highest in primary care and directly associated with antibiotic resistance in the community. We assessed regional variations in antibiotic use in primary care in Switzerland and explored prescription patterns in relation to the use of point-of-care tests. Defined daily doses of antibiotics per 1000 inhabitants per day (DDD(1000pd)) were calculated for the year 2007 from reimbursement data of the largest Swiss health insurer, based on the Anatomical Therapeutic Chemical (ATC) classification and the DDD methodology recommended by the WHO. We present ecological associations by use of descriptive and regression analysis. We analysed data from 1 067 934 adults, representing 17.1% of the Swiss population. The rate of outpatient antibiotic prescriptions in the entire population was 8.5 DDD(1000pd), varying between 7.28 DDD(1000pd) for northwest Switzerland and 11.33 DDD(1000pd) for the Lake Geneva region. DDD(1000pd) values for the three most prescribed antibiotics were 2.90 for amoxicillin and amoxicillin-clavulanate, 1.77 for fluoroquinolones, and 1.34 for macrolides. Regions with higher DDD(1000pd) showed higher seasonal variability in antibiotic use and lower use of all point-of-care tests. In regression analysis for each class of antibiotics, the use of any point-of-care test was consistently associated with fewer antibiotic prescriptions. Prescription rates of primary care physicians varied between Swiss regions and were lower in northwest Switzerland and among physicians using point-of-care tests. Ecological studies are prone to bias, and whether point-of-care tests reduce antibiotic use has to be investigated in pragmatic primary care trials.
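The DDD(1000pd) metric above normalizes the total number of WHO defined daily doses dispensed by the covered population and the observation period. A minimal sketch of that arithmetic, with invented numbers:

```python
# Sketch: WHO-style DDD per 1000 inhabitants per day.
# total_ddd: sum of defined daily doses dispensed over the period;
# population: number of covered persons; days: length of the period.
def ddd_per_1000_per_day(total_ddd, population, days=365):
    return total_ddd * 1000 / (population * days)

# Invented example: 3.65 million DDDs dispensed to 1 million insured
# persons over one year:
rate = ddd_per_1000_per_day(3_650_000, 1_000_000)
```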
Abstract:
The first part of this review examined ISO approval requirements and in vitro testing. In the second part, non-standardized test methods for composite materials are presented and discussed. Physical tests are primarily described. Analyses of surface gloss and alterations, as well as aging simulations of dental materials, are presented. Again, the importance of laboratory tests in determining clinical outcomes is evaluated. Differences in the measurement protocols of the various testing institutes, and how these differences can influence the results, are also discussed. Because there is no standardization of test protocols, the values determined by different institutes cannot be directly compared. However, the ranking of the tested materials should be the same if a valid protocol is applied by different institutes. The modulus of elasticity, the expansion after water sorption, and the polishability of the material are all clinically relevant, whereas factors measured by other test protocols may have no clinical correlation. The handling properties of the materials are highly dependent on operators' preferences; therefore, no standard values can be given.
Abstract:
The first part of this three-part review on the relevance of laboratory testing of composites and adhesives deals with approval requirements for composite materials. We compare the in vivo and in vitro literature data and discuss the relevance of in vitro analyses. The standardized ISO protocols are presented, with a focus on the evaluation of physical parameters. These tests all have a standardized protocol that describes the entire test set-up. The tests analyse flexural strength, depth of cure, susceptibility to ambient light, color stability, water sorption and solubility, and radiopacity. Some tests have a clinical correlation. A high flexural strength, for instance, decreases the risk of fractures of the marginal ridge in posterior restorations and incisal edge build-ups of restored anterior teeth. Other tests do not have a clinical correlation or the threshold values are too low, which results in an approval of materials that show inferior clinical properties (e.g., radiopacity). It is advantageous to know the test set-ups and the ideal threshold values to correctly interpret the material data. Overall, however, laboratory assessment alone cannot ensure the clinical success of a product.
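For the flexural-strength test mentioned above, a three-point-bend formula of the form σ = 3Fl/(2bh²) is commonly used for rectangular composite bars; the sketch below assumes this standard formula, typical bar dimensions, and an invented failure load, and is not taken from the ISO text itself.

```python
# Sketch: flexural strength of a rectangular bar in three-point bending,
# sigma = 3 * F * l / (2 * b * h^2).
# F: failure load (N), l: support span (mm), b: width (mm), h: height (mm).
# With N and mm, the result comes out directly in MPa (N/mm^2).
def flexural_strength_mpa(force_n, span_mm, width_mm, height_mm):
    return 3 * force_n * span_mm / (2 * width_mm * height_mm ** 2)

# Invented example: a 2 x 2 mm bar over a 20 mm span failing at 80 N:
sigma = flexural_strength_mpa(80, 20, 2.0, 2.0)  # 300.0 MPa
```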
Abstract:
INTRODUCTION: Rivaroxaban (RXA) is licensed for prophylaxis of venous thromboembolism after major orthopaedic surgery of the lower limbs. Currently, no test to quantify RXA in plasma has been validated in an inter-laboratory setting. Our study had three aims: to assess i) the feasibility of RXA quantification with a commercial anti-FXa assay, ii) its accuracy and precision in an inter-laboratory setting, and iii) the influence of 10 mg of RXA on routine coagulation tests. METHODS: The same chromogenic anti-FXa assay (Hyphen BioMed) was used in all participating laboratories. RXA calibrators and sets of blinded probes (aim ii) were prepared in vitro by spiking normal plasma. The precise RXA content was assessed by high-pressure liquid chromatography-tandem mass spectrometry. For ex vivo studies (aim iii), plasma samples from 20 healthy volunteers taken before and 2-3 hours after ingestion of 10 mg of RXA were analyzed by the participating laboratories. RESULTS: RXA can be assayed chromogenically. Among the participating laboratories, the mean accuracy and the mean coefficient of variation for precision of RXA quantification were 7.0% and 8.8%, respectively. The mean RXA concentration was 114 ± 43 µg/L. RXA significantly altered the prothrombin time, the activated partial thromboplastin time, and factor analyses for intrinsic and extrinsic factors. Determinations of thrombin time, fibrinogen, FXIII and D-dimer levels were not affected. CONCLUSIONS: RXA plasma levels can be quantified accurately and precisely by a chromogenic anti-FXa assay on different coagulometers in different laboratories. Ingestion of 10 mg of RXA results in significant alterations of both PT- and aPTT-based coagulation assays.
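The inter-laboratory accuracy and precision figures above can be illustrated with a small sketch: accuracy as the mean relative deviation of measured concentrations from the known spiked target, and precision as the coefficient of variation across measurements. The measurement values below are invented, not the study's data.

```python
# Sketch: accuracy and precision metrics for spiked-plasma measurements.
# measured: list of concentrations reported by the laboratories (e.g. µg/L);
# target: the known spiked concentration. All values invented.
from statistics import mean, stdev

def accuracy_pct(measured, target):
    """Mean absolute relative deviation from the known spiked value, in %."""
    return 100 * mean(abs(m - target) / target for m in measured)

def cv_pct(measured):
    """Coefficient of variation (sample SD / mean), in %."""
    return 100 * stdev(measured) / mean(measured)

labs = [95.0, 100.0, 105.0]   # invented lab results for a 100 µg/L probe
acc = accuracy_pct(labs, 100.0)
cv = cv_pct(labs)
```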
Abstract:
Plutonium is present in the environment as a consequence of atmospheric nuclear tests, nuclear weapons production and industrial releases over the past 50 years. To study temporal trends, a high-resolution Pu record was obtained by analyzing 52 discrete samples of an alpine firn/ice core from Colle Gnifetti (Monte Rosa, 4450 m a.s.l.), dating from 1945 to 1990. The 239Pu signal was recorded directly, without decontamination or preconcentration steps, using an Inductively Coupled Plasma Sector Field Mass Spectrometer (ICP-SFMS) equipped with a high-efficiency sample introduction system, thus requiring much less sample preparation than previously reported methods. The 239Pu profile reflects the three main periods of atmospheric nuclear weapons testing: the earliest peak lasted from 1954/55 to 1958 and was caused by the first testing period, reaching a maximum in 1958. Despite a temporary halt of testing in 1959/60, the Pu concentration decreased only by half with respect to the 1958 peak, due to long atmospheric residence times. In 1961/62 Pu concentrations rapidly increased, reaching a maximum in 1963 that was about 40% more intense than the 1958 peak. After the signing of the Limited Test Ban Treaty between the USA and the USSR in 1963, Pu deposition decreased very sharply, reaching a minimum in 1967. The third period (1967-1975) is characterized by irregular Pu concentrations with smaller peaks (about 20-30% of the 1963 peak), which might be related to the deposition of Saharan dust contaminated by the French nuclear tests of the 1960s. The data presented are in very good agreement with Pu profiles previously obtained from the Col du Dome ice core (by multi-collector ICP-MS) and the Belukha ice core (by Accelerator Mass Spectrometry, AMS). Although a semi-quantitative method was employed here, the results are quantitatively comparable to previously published results.
Abstract:
Adaptive radiation is usually thought to be associated with speciation, but the evolution of intraspecific polymorphisms without speciation is also possible. The radiation of cichlid fish in Lake Victoria (LV) is perhaps the most impressive example of a recent rapid adaptive radiation, with 600+ very young species. Key questions about its origin remain poorly resolved, such as the relative importance of speciation versus polymorphism, whether species persist on evolutionary time scales, and whether speciation happens more commonly in small isolated populations or in large connected ones. We used 320 individuals from 105 putative species from Lakes Victoria, Edward, Kivu, Albert, Nabugabo and Saka in a radiation-wide amplified fragment length polymorphism (AFLP) genome scan to address some of these questions. We demonstrate pervasive signatures of speciation, supporting the classical model of adaptive radiation associated with speciation. A positive relationship between the age of the lakes and the average genomic differentiation of their species, together with a significant fraction of molecular variance explained by above-species-level taxonomy, suggests the persistence of species on evolutionary time scales, with radiation through sequential speciation rather than a single starburst. Finally, the large gene diversity retained from colonization through to individual species in every radiation suggests large effective population sizes and makes speciation in small geographical isolates unlikely.
Abstract:
This study evaluated the correlation between three strip-type, colorimetric tests and two laboratory methods for the analysis of salivary buffering. The strip-type tests were Saliva-Check Buffer, Dentobuff Strip and the CRT(®) Buffer test. The laboratory methods were Ericsson's laboratory method and a monotone acid/base titration used to create a reference scale for the salivary titratable acidity. Additionally, defined buffer solutions were prepared and tested to simulate the carbonate, phosphate and protein buffer systems of saliva. The correlation between the methods was analysed by Spearman's rank test. Disagreement was detected between the buffering capacity values obtained with the three strip-type tests, and it was more pronounced for saliva samples with medium and low buffering capacities. All strip-type tests were able to assign the hydrogen carbonate, dihydrogen phosphate and 0.1% protein buffer solutions to the correct buffer categories. However, at a total protein concentration of 0.6%, none of the test systems worked accurately. Improvements to the strip-type tests are necessary because of their disagreement with Ericsson's laboratory method and their dependence on the protein content of saliva.
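The Spearman's rank test used above can be sketched from scratch: rank both measurement series (averaging ranks for ties) and take the Pearson correlation of the ranks. The helper below is a generic illustration of the statistic, not the study's analysis code.

```python
# Sketch: Spearman's rank correlation between two measurement series
# (e.g. a strip-type reading vs. a laboratory buffering value).
from statistics import mean

def rank(values):
    """Ranks starting at 1; tied values share their average rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Pearson correlation of the rank vectors."""
    rx, ry = rank(x), rank(y)
    mx, my = mean(rx), mean(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den
```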