887 results for "Standardised tests"
Abstract:
Clinical manifestations of lactase (LCT) deficiency include intestinal and extra-intestinal symptoms. The lactose hydrogen breath test (H2-BT) is considered the gold standard for evaluating LCT deficiency (LD). Recently, the single-nucleotide polymorphism C/T(-13910) has been associated with LD. The objectives of the present study were to evaluate the agreement between genetic testing for LCT C/T(-13910) and the lactose H2-BT, and the diagnostic value of extended symptom assessment. Of the 201 patients included in the study, 194 with clinical suspicion of LD (139 females, mean age 38 years, range 17-79; 55 males, mean age 38 years, range 18-68) underwent a 3-4 h H2-BT and genetic testing for LCT C/T(-13910). Patients rated five intestinal and four extra-intestinal symptoms during the H2-BT and then at home for the following 48 h. With the H2-BT as the gold standard, the CC(-13910) genotype had a sensitivity of 97% and a specificity of 95%, with a κ of 0.9, in diagnosing LCT deficiency. Patients with LD had more intense intestinal symptoms in the 4 h following the lactose challenge included in the H2-BT. We found no difference in the intensity of extra-intestinal symptoms between patients with and without LD. Symptom assessment yielded differences for the intestinal symptoms abdominal pain, bloating, borborygmi and diarrhoea between 120 min and 4 h after the oral lactose challenge. Extra-intestinal symptoms (dizziness, headache and myalgia) and extension of symptom assessment up to 48 h did not consistently show different results. In conclusion, genetic testing shows excellent agreement with the standard lactose H2-BT and may replace breath testing for the diagnosis of LD. Extended symptom scores and assessment of extra-intestinal symptoms have limited diagnostic value in the evaluation of LD.
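For readers checking agreement figures like those above: sensitivity, specificity and Cohen's kappa all follow from a 2×2 contingency table of test versus gold standard. A minimal sketch in Python, using illustrative counts rather than the study's data:

```python
# Diagnostic agreement statistics from a 2x2 table.
# tp/fp/fn/tn below are illustrative counts, not the study's data.

def diagnostic_stats(tp, fp, fn, tn):
    n = tp + fp + fn + tn
    sensitivity = tp / (tp + fn)          # true-positive rate
    specificity = tn / (tn + fp)          # true-negative rate
    p_observed = (tp + tn) / n            # raw agreement
    # chance agreement expected from the marginal totals
    p_expected = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / n ** 2
    kappa = (p_observed - p_expected) / (1 - p_expected)
    return sensitivity, specificity, kappa

sens, spec, kappa = diagnostic_stats(tp=45, fp=5, fn=5, tn=45)
print(sens, spec, round(kappa, 2))  # 0.9 0.9 0.8
```

A kappa of 0.9, as reported here, indicates almost perfect agreement on the conventional Landis-Koch scale.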
Abstract:
The best available test for the diagnosis of upper extremity deep venous thrombosis (UEDVT) is contrast venography. The aim of this systematic review was to assess whether the diagnostic accuracy of other tests for clinically suspected UEDVT is high enough to justify their use in clinical practice, and to evaluate whether any test can replace venography.
Abstract:
The use of antibiotics is highest in primary care and is directly associated with antibiotic resistance in the community. We assessed regional variations in antibiotic use in primary care in Switzerland and explored prescription patterns in relation to the use of point-of-care tests. Defined daily doses of antibiotics per 1000 inhabitants per day (DDD(1000pd)) were calculated for the year 2007 from reimbursement data of the largest Swiss health insurer, based on the anatomic therapeutic chemical (ATC) classification and the DDD methodology recommended by the WHO. We present ecological associations by use of descriptive and regression analysis. We analysed data from 1 067 934 adults, representing 17.1% of the Swiss population. The rate of outpatient antibiotic prescriptions in the entire population was 8.5 DDD(1000pd), varying between 7.28 DDD(1000pd) in northwest Switzerland and 11.33 DDD(1000pd) in the Lake Geneva region. DDD(1000pd) values for the three most prescribed antibiotics were 2.90 for amoxicillin and amoxicillin-clavulanate, 1.77 for fluoroquinolones, and 1.34 for macrolides. Regions with higher DDD(1000pd) showed higher seasonal variability in antibiotic use and lower use of all point-of-care tests. In regression analysis for each class of antibiotics, the use of any point-of-care test was consistently associated with fewer antibiotic prescriptions. Prescription rates of primary care physicians varied between Swiss regions and were lower in northwest Switzerland and among physicians using point-of-care tests. Ecological studies are prone to bias, and whether point-of-care tests reduce antibiotic use needs to be investigated in pragmatic primary care trials.
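The DDD(1000pd) metric used above normalises total consumption by population size and observation period, following the WHO ATC/DDD methodology. A hedged sketch (the function name and the figures are ours, chosen only to illustrate the arithmetic):

```python
def ddd_per_1000_per_day(total_ddd, population, days):
    """Defined daily doses per 1000 inhabitants per day (WHO ATC/DDD)."""
    return total_ddd / (population * days) * 1000

# Illustrative figures only: about 3.1 million DDDs dispensed over one year
# to a population of one million corresponds to 8.5 DDD(1000pd).
rate = ddd_per_1000_per_day(total_ddd=3_102_500, population=1_000_000, days=365)
print(round(rate, 1))  # 8.5
```

Normalising this way is what makes consumption comparable across regions of different population size, as in the Swiss regional comparison above.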
Abstract:
The first part of this review examined ISO approval requirements and in vitro testing. In the second part, non-standardized test methods for composite materials are presented and discussed. Physical tests are primarily described. Analyses of surface gloss and alterations, as well as aging simulations of dental materials, are presented. Again, the importance of laboratory tests in determining clinical outcomes is evaluated. Differences in the measurement protocols of the various testing institutes, and how these differences can influence the results, are also discussed. Because there is no standardization of test protocols, the values determined by different institutes cannot be directly compared. However, the ranking of the tested materials should be the same if a valid protocol is applied by different institutes. The modulus of elasticity, the expansion after water sorption and the polishability of the material are all clinically relevant, whereas factors measured by other test protocols may have no clinical correlation. The handling properties of the materials are highly dependent on operators' preferences; therefore, no standard values can be given.
Abstract:
The first part of this three-part review on the relevance of laboratory testing of composites and adhesives deals with approval requirements for composite materials. We compare the in vivo and in vitro literature data and discuss the relevance of in vitro analyses. The standardized ISO protocols are presented, with a focus on the evaluation of physical parameters. These tests all have a standardized protocol that describes the entire test set-up. The tests analyse flexural strength, depth of cure, susceptibility to ambient light, color stability, water sorption and solubility, and radiopacity. Some tests have a clinical correlation. A high flexural strength, for instance, decreases the risk of fractures of the marginal ridge in posterior restorations and incisal edge build-ups of restored anterior teeth. Other tests do not have a clinical correlation or the threshold values are too low, which results in an approval of materials that show inferior clinical properties (e.g., radiopacity). It is advantageous to know the test set-ups and the ideal threshold values to correctly interpret the material data. Overall, however, laboratory assessment alone cannot ensure the clinical success of a product.
Abstract:
INTRODUCTION: Rivaroxaban (RXA) is licensed for prophylaxis of venous thromboembolism after major orthopaedic surgery of the lower limbs. Currently, no test to quantify RXA in plasma has been validated in an inter-laboratory setting. Our study had three aims: to assess i) the feasibility of RXA quantification with a commercial anti-FXa assay, ii) its accuracy and precision in an inter-laboratory setting, and iii) the influence of 10 mg of RXA on routine coagulation tests. METHODS: The same chromogenic anti-FXa assay (Hyphen BioMed) was used in all participating laboratories. RXA calibrators and sets of blinded samples (aim ii) were prepared in vitro by spiking normal plasma. The precise RXA content was assessed by high-pressure liquid chromatography-tandem mass spectrometry. For ex vivo studies (aim iii), plasma samples from 20 healthy volunteers, taken before and 2-3 hours after ingestion of 10 mg of RXA, were analyzed by the participating laboratories. RESULTS: RXA can be assayed chromogenically. Among the participating laboratories, the mean accuracy and the mean coefficient of variation for precision of RXA quantification were 7.0% and 8.8%, respectively. The mean RXA concentration was 114 ± 43 µg/L. RXA significantly altered prothrombin time, activated partial thromboplastin time, and factor assays for the intrinsic and extrinsic pathways. Determinations of thrombin time, fibrinogen, FXIII and D-dimer levels were not affected. CONCLUSIONS: RXA plasma levels can be quantified accurately and precisely by a chromogenic anti-FXa assay on different coagulometers in different laboratories. Ingestion of 10 mg RXA results in significant alterations of both PT- and aPTT-based coagulation assays.
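Inter-laboratory accuracy and precision, as reported above, are conventionally summarised as the mean relative deviation from the nominal (spiked) concentration and the coefficient of variation, respectively. A sketch with made-up numbers, not the study's measurements:

```python
import statistics

def accuracy_percent(measured, nominal):
    """Mean absolute relative deviation from the nominal value, in %."""
    return 100 * statistics.fmean(abs(m - nominal) / nominal for m in measured)

def cv_percent(measured):
    """Coefficient of variation: sample SD as a percentage of the mean."""
    return 100 * statistics.stdev(measured) / statistics.fmean(measured)

# Hypothetical RXA results (ug/L) reported by three labs for one spiked sample:
labs = [90.0, 100.0, 110.0]
print(round(accuracy_percent(labs, nominal=100.0), 2))  # 6.67
print(round(cv_percent(labs), 1))                       # 10.0
```

By these definitions, the study's figures of 7.0% (accuracy) and 8.8% (precision) describe how far, on average, each laboratory's result sat from the true spiked concentration and how much the laboratories scattered around their common mean.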
Abstract:
Visual outcomes in the treatment of neovascular age-related macular degeneration (AMD) with intravitreally injected anti-VEGF agents (IVT) clearly depend on injection frequency. Under the European approval, ranibizumab is to be used only in cases of recurrent visual loss after the loading phase. In contrast, monthly treatment, as provided in the ANCHOR and MARINA studies, is generally allowed in Switzerland. Nevertheless, injection frequency is commonly reduced, both because of the cost pressures in all health systems and, of course, because the necessary strict monitoring and reinjection regimes raise management problems as patient numbers increase. In this article we present the treatment regime of our University Eye Hospital, in which a reduced injection frequency achieves essentially the same improved and stable visual outcomes as in ANCHOR and MARINA, while still requiring significantly more injections than are generally given in other European countries. To achieve this in a large number of patients, we focused on restructuring our outpatient flow for IVT patients, with particular emphasis on patient separation and standardisation of treatment steps, leading to significantly reduced time per patient. Measurements of timing and patient satisfaction before and after the restructuring underline its importance for treating more patients at high quality in the future. The exceptional importance of spectral-domain OCT measurements as the most important criterion for indicating re-treatment is illustrated.
Abstract:
The utility of quantitative Pneumocystis jirovecii PCR in routine clinical practice for diagnosing Pneumocystis pneumonia (PCP) in immunocompromised non-HIV patients is unknown. We analysed bronchoalveolar lavage fluid with real-time quantitative P. jirovecii PCR in 71 cases of definitive PCP, defined by positive immunofluorescence (IF) tests, and in 171 randomly selected patients with acute lung disease. In those patients, possible PCP cases were identified using a novel standardised PCP probability algorithm and chart review. PCR performance was compared with IF testing, clinical judgment and the PCP probability algorithm. Quantitative P. jirovecii PCR values >1,450 pathogens·mL(-1) had a positive predictive value of 98.0% (95% CI 89.6-100.0%) for diagnosing definitive PCP. PCR values between 1 and 1,450 pathogens·mL(-1) were associated with both colonisation and infection; thus, a cut-off between the two conditions could not be identified, and diagnosis of PCP in this setting relied on IF and clinical assessment. Clinical PCP could be ruled out in 99.3% of 153 patients with negative PCR results. Quantitative PCR is useful for diagnosing PCP and is complementary to IF: values >1,450 pathogens·mL(-1) allow reliable diagnosis, whereas negative results virtually exclude PCP, and intermediate values require additional clinical assessment and IF testing. On the basis of our data, and owing to economic and logistical limitations, we propose a clinical algorithm in which IF remains the preferred first test in most cases, followed by PCR in patients with a negative IF and strong clinical suspicion of PCP.
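The three-zone interpretation described above maps directly onto a small decision rule. A sketch (the function name is ours; the thresholds and zone meanings come from the abstract):

```python
def interpret_pjp_pcr(pathogens_per_ml):
    """Classify a quantitative P. jirovecii PCR result (pathogens/mL)
    using the cut-offs reported in the abstract."""
    if pathogens_per_ml > 1450:
        return "PCP"                      # PPV ~98% in this cohort
    if pathogens_per_ml >= 1:
        # colonisation and infection overlap in this range
        return "indeterminate: confirm with IF and clinical assessment"
    return "PCP virtually excluded"       # negative PCR

print(interpret_pjp_pcr(2000))  # PCP
print(interpret_pjp_pcr(500))   # indeterminate: confirm with IF and clinical assessment
print(interpret_pjp_pcr(0))     # PCP virtually excluded
```

The middle branch is the clinically interesting one: because no cut-off separates colonisation from infection between 1 and 1,450 pathogens·mL(-1), the rule defers to IF and clinical assessment there rather than forcing a binary call.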
Abstract:
Plutonium is present in the environment as a consequence of atmospheric nuclear tests, nuclear weapons production and industrial releases over the past 50 years. To study temporal trends, a high-resolution Pu record was obtained by analyzing 52 discrete samples of an alpine firn/ice core from Colle Gnifetti (Monte Rosa, 4450 m a.s.l.), dating from 1945 to 1990. The 239Pu signal was recorded directly, without decontamination or preconcentration steps, using an inductively coupled plasma sector field mass spectrometer (ICP-SFMS) equipped with a high-efficiency sample introduction system, thus requiring much less sample preparation than previously reported methods. The 239Pu profile reflects the three main periods of atmospheric nuclear weapons testing: the earliest peak lasted from 1954/55 to 1958, was caused by the first testing period, and reached a maximum in 1958. Despite a temporary halt of testing in 1959/60, the Pu concentration decreased only by half with respect to the 1958 peak, owing to long atmospheric residence times. In 1961/62 Pu concentrations rapidly increased, reaching a maximum in 1963 that was about 40% more intense than the 1958 peak. After the signing of the Limited Test Ban Treaty between the USA and the USSR in 1963, Pu deposition decreased very sharply, reaching a minimum in 1967. The third period (1967-1975) is characterized by irregular Pu concentrations, with smaller peaks (about 20-30% of the 1963 peak) that might be related to the deposition of Saharan dust contaminated by the French nuclear tests of the 1960s. The data presented are in very good agreement with Pu profiles previously obtained from the Col du Dome ice core (by multi-collector ICP-MS) and the Belukha ice core (by accelerator mass spectrometry, AMS). Although a semi-quantitative method was employed here, the results are quantitatively comparable to previously published results.
Abstract:
Adaptive radiation is usually thought to be associated with speciation, but the evolution of intraspecific polymorphisms without speciation is also possible. The radiation of cichlid fish in Lake Victoria (LV) is perhaps the most impressive example of a recent rapid adaptive radiation, with more than 600 very young species. Key questions about its origin remain poorly characterized, such as the importance of speciation versus polymorphism, whether species persist on evolutionary time scales, and whether speciation happens more commonly in small isolated or in large connected populations. We used 320 individuals from 105 putative species from Lakes Victoria, Edward, Kivu, Albert, Nabugabo and Saka in a radiation-wide amplified fragment length polymorphism (AFLP) genome scan to address some of these questions. We demonstrate pervasive signatures of speciation, supporting the classical model of adaptive radiation associated with speciation. A positive relationship between the age of lakes and the average genomic differentiation of their species, together with a significant fraction of molecular variance explained by above-species-level taxonomy, suggests the persistence of species on evolutionary time scales, with radiation through sequential speciation rather than a single starburst. Finally, the large gene diversity retained from colonization to individual species in every radiation suggests large effective population sizes and makes speciation in small geographical isolates unlikely.
Abstract:
This study evaluated the correlation between three strip-type, colorimetric tests and two laboratory methods with respect to the analysis of salivary buffering. The strip-type tests were Saliva-Check Buffer, Dentobuff Strip and the CRT(®) Buffer test. The laboratory methods included Ericsson's laboratory method and a monotone acid/base titration used to create a reference scale for salivary titratable acidity. Additionally, defined buffer solutions were prepared and tested to simulate the carbonate, phosphate and protein buffer systems of saliva. The correlation between the methods was analysed by Spearman's rank test. Disagreement was detected between the buffering capacity values obtained with the three strip-type tests, and it was more pronounced for saliva samples with medium and low buffering capacity. All strip-type tests were able to assign the hydrogencarbonate, di-hydrogenphosphate and 0.1% protein buffer solutions to the correct buffer categories. However, at 0.6% total protein concentration, none of the test systems worked accurately. Improvements to strip-type tests are necessary because of their disagreement with Ericsson's laboratory method and their dependence on the protein content of saliva.
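Spearman's rank test, used above to compare the methods, reduces to correlating the ranks of paired measurements. A minimal, tie-free sketch with illustrative data, not the study's:

```python
def ranks(values):
    """Rank positions (1-based); assumes no ties for simplicity."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def spearman_rho(x, y):
    """Spearman rank correlation via the classic 6*sum(d^2) formula (no ties)."""
    n = len(x)
    d2 = sum((rx - ry) ** 2 for rx, ry in zip(ranks(x), ranks(y)))
    return 1 - 6 * d2 / (n * (n ** 2 - 1))

# A monotone relationship gives rho = 1, a reversed one gives rho = -1.
print(spearman_rho([1, 2, 3, 4], [10, 20, 30, 40]))  # 1.0
print(spearman_rho([1, 2, 3], [9, 5, 2]))            # -1.0
```

Rank-based correlation suits buffering-capacity data because the strip tests yield ordinal categories rather than continuous values; real saliva data would contain ties, which this simplified formula does not handle.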
Abstract:
New oral anticoagulants promise to overcome essential drawbacks of traditional substances. They have a predictable therapeutic effect, a wide therapeutic window and only limited interactions with food and drugs, and they can be administered orally at a fixed dose. On the other hand, knowledge of the laboratory management of the new anticoagulants is limited. In the present article we discuss possible indications and available assays for monitoring of rivaroxaban, apixaban and dabigatran. Furthermore, we discuss the interpretation of routine coagulation tests during therapy with these new drugs.
Abstract:
A 7-month-old male kitten was presented with chronic constipation and retarded growth. Clinical examination revealed disproportionate dwarfism with mild skeletal abnormalities and a palpable thyroid gland. The presumptive diagnosis of congenital hypothyroidism was confirmed by low serum total thyroxine (tT4) concentration before and after the administration of thyroid-stimulating hormone (TSH), increased endogenous TSH concentration and an abnormal thyroid scintigraphic scan. The kitten had abnormal liver function tests and a decreased insulin-like growth factor 1 (IGF-1) concentration, both of which returned to normal, in correspondence with an improvement of the clinical signs, after 6 weeks of thyroxine therapy. Congenital hypothyroidism is a rare disease that may present with considerable variation in clinical manifestation. In cases in which clinical signs are ambiguous, disorders such as portosystemic shunt and hyposomatotropism have to be taken into account as differential diagnoses. As hypothyroidism may be associated with abnormal liver function tests and low IGF-1 concentrations, test results have to be interpreted carefully.