46 results for standardised tests
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
Avidity tests can be used to discriminate between cattle that are acutely and chronically infected with the intracellular parasite Neospora caninum. The aim of this study was to compare the IgG avidity ELISA tests used in four European laboratories. A coded panel of 200 bovine sera from well-documented, naturally and experimentally N. caninum-infected animals was analysed at the participating laboratories with their respective assay systems and laboratory protocols. Comparing the numeric test results, the concordance correlation coefficients were between 0.479 and 0.776. The laboratories categorize the avidity results into the classes "low" and "high", which are considered indicative of recent and chronic infection, respectively. Three laboratories also use an "intermediate" class. When the categorized data were analysed by Kappa statistics, there was moderate to substantial agreement between the laboratories. Agreement was better overall for dichotomized results than when an intermediate class was also used. Taken together, this first ring test for N. caninum IgG avidity assays showed moderate agreement between the assays used by the different laboratories to estimate IgG avidity. Our experience suggests that avidity tests are sometimes less robust than conventional ELISAs. Therefore, it is essential that they are carefully standardised and their performance continuously evaluated.
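The two agreement measures mentioned above can be illustrated with a short sketch: Lin's concordance correlation coefficient for the numeric avidity values and Cohen's kappa for the categorized classes. The data and variable names below (lab_a, lab_b, cat_a, cat_b) are hypothetical and serve only to show the calculation, assuming numpy and scikit-learn are available.

import numpy as np
from sklearn.metrics import cohen_kappa_score

def concordance_correlation(x, y):
    # Lin's concordance correlation coefficient for numeric avidity values
    x, y = np.asarray(x, dtype=float), np.asarray(y, dtype=float)
    sxy = np.cov(x, y, bias=True)[0, 1]
    return 2 * sxy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

# Numeric avidity results (percent) reported by two laboratories for the same sera (illustrative)
lab_a = [12.0, 35.5, 62.1, 80.3, 55.0, 18.7]
lab_b = [15.2, 30.1, 58.9, 85.0, 49.5, 22.3]
print(concordance_correlation(lab_a, lab_b))

# Categorized results ("low"/"intermediate"/"high") compared with Cohen's kappa
cat_a = ["low", "low", "high", "high", "intermediate", "low"]
cat_b = ["low", "intermediate", "high", "high", "high", "low"]
print(cohen_kappa_score(cat_a, cat_b))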
Abstract:
BACKGROUND Detection of HIV-1 p24 antigen permits early identification of primary HIV infection and timely intervention to limit further spread of the infection. In principle, HIV screening should detect all viral variants equally, but reagents for a standardised test evaluation are limited. Therefore, we aimed to create an inexhaustible panel of diverse HIV-1 p24 antigens. METHODS We generated a panel of 43 recombinantly expressed virus-like particles (VLPs) containing the structural Gag proteins of HIV-1 subtypes A-H and circulating recombinant forms (CRF) CRF01_AE, CRF02_AG, CRF12_BF, CRF20_BG and group O. Eleven 4th-generation antigen/antibody tests and five antigen-only tests were evaluated for their ability to detect VLPs diluted in human plasma to p24 concentrations equivalent to 50, 10 and 2 IU/ml of the WHO p24 standard. Three tests were also evaluated for their ability to detect p24 after heat denaturation for immune-complex disruption, a prerequisite for ultrasensitive p24 detection. RESULTS Our VLP panel exhibited an average intra-clade p24 diversity of 6.7%. Among the 4th-generation tests, the Abbott Architect and Siemens Enzygnost Integral 4 had the highest sensitivities, 97.7% and 93% respectively. Alere Determine Combo and BioRad Access were the least sensitive, at 10.1% and 40.3% respectively. Antigen-only tests were slightly more sensitive than combination tests. Almost all tests detected the WHO HIV-1 p24 standard at a concentration of 2 IU/ml, but their ability to detect this input varied greatly across subtypes. Heat treatment lowered the overall detectability of HIV-1 p24 in two of the three tests, but only a few VLPs showed a more than 3-fold loss in p24 detection. CONCLUSIONS The HIV-1 Gag subtype panel has broad diversity and proved useful for a standardised evaluation of the detection limit and breadth of subtype detection of p24 antigen-detecting tests. Several tests exhibited problems, particularly with non-B subtypes.
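As a simple illustration of how the panel-based sensitivity figures and the 3-fold loss criterion above might be computed, here is a minimal sketch; the VLP identifiers, reactivity calls and signal values are invented for the example and are not the study's data.

# Reactivity of each VLP at a p24 input of 2 IU/ml (True = detected); values are illustrative
detected = {"A_01": True, "B_02": True, "CRF01_AE_03": False, "O_04": True, "C_05": True}
panel_sensitivity = 100.0 * sum(detected.values()) / len(detected)
print(f"{sum(detected.values())}/{len(detected)} VLPs detected "
      f"({panel_sensitivity:.1f}% panel sensitivity at 2 IU/ml)")

# Fold loss in p24 signal after heat denaturation (signal before/after heat, illustrative)
native, heated = 1.85, 0.52
print("more than 3-fold loss" if native / heated > 3 else "within 3-fold")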
Abstract:
In dentistry, the restoration of decayed teeth is challenging and makes great demands on both the dentist and the materials. Hence, fiber-reinforced posts have been introduced. The effects of different variables on the ultimate load of teeth restored using fiber-reinforced posts are controversial, possibly because the reported results are mostly based on non-standardized in vitro tests and are therefore inhomogeneous. This study combines the advantages of in vitro tests and finite element analysis (FEA) to clarify the effects of ferrule height, post length and the cementation technique used for restoration. Sixty-four single-rooted premolars were decoronated (ferrule height 1 or 2 mm), endodontically treated and restored using fiber posts (length 2 or 7 mm), composite fillings and metal crowns (resin bonded or conventionally cemented). After thermocycling and chewing simulation, the samples were loaded until fracture, and first damage events were recorded. Univariate ANOVA (UNIANOVA) of the recorded fracture loads showed that ferrule height and cementation technique were significant, i.e. increased ferrule height and resin bonding of the crown resulted in higher fracture loads. Post length had no significant effect. All conventionally cemented crowns with a 1-mm ferrule height failed during artificial ageing, in contrast to resin-bonded crowns (75% survival rate). FEA confirmed these results and provided information about stress and force distribution within the restoration. Based on the findings of the in vitro tests and the computations, we conclude that crowns, especially those with a small ferrule height, should be resin bonded. Finally, centrally positioned fiber-reinforced posts did not contribute to load transfer as long as the bond between the tooth and the composite core was intact.
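A univariate ANOVA of this kind can be sketched as follows, assuming pandas and statsmodels; the data frame, fracture loads and column names are hypothetical stand-ins for the factors named above (ferrule height, post length, cementation technique).

import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

# Full 2 x 2 x 2 factorial with one (invented) fracture load in N per combination
df = pd.DataFrame({
    "fracture_load": [380, 520, 395, 540, 450, 610, 465, 625],
    "ferrule_mm":    [1, 2, 1, 2, 1, 2, 1, 2],
    "post_mm":       [2, 2, 7, 7, 2, 2, 7, 7],
    "cementation":   ["conventional"] * 4 + ["resin"] * 4,
})

# Univariate ANOVA of fracture load on the three factors
model = smf.ols("fracture_load ~ C(ferrule_mm) + C(post_mm) + C(cementation)", data=df).fit()
print(anova_lm(model, typ=2))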
Abstract:
It is well known that the early initiation of specific anti-infective therapy is crucial to reduce mortality in severe infection. Culture-based pathogen detection is the diagnostic gold standard in such diseases. However, these methods yield results after 24 to 48 hours at the earliest. Therefore, severe infections such as sepsis need to be treated with empirical antimicrobial therapy, which is ineffective in an unknown fraction of these patients. Today's microbiological point of care tests are pathogen-specific and therefore not appropriate for infections with a variety of possible pathogens. Molecular nucleic acid diagnostics such as polymerase chain reaction (PCR) allow the identification of pathogens and resistances. These methods are routinely used to speed up the analysis of positive blood cultures. The newest PCR-based systems allow the identification of the 25 most frequent sepsis pathogens in parallel, without prior culture, in less than 6 hours. These systems might thereby shorten the period of possibly insufficient anti-infective therapy. However, these elaborate tools are not suitable as point of care diagnostics. Miniaturization and automation of the nucleic acid-based methods are still pending, as is an increase in the number of pathogens and resistance genes detectable by these methods. It is assumed that molecular PCR techniques will have an increasing impact on microbiological diagnostics in the future.
Abstract:
Quantitative sensory tests are widely used in human research to evaluate the effect of analgesics and explore altered pain mechanisms, such as central sensitization. In order to apply these tests in clinical practice, knowledge of reference values is essential. The aim of this study was to determine the reference values of pain thresholds for mechanical and thermal stimuli, as well as withdrawal time for the cold pressor test in 300 pain-free subjects. Pain detection and pain tolerance thresholds to pressure, heat and cold were determined at three body sites: (1) lower back, (2) suprascapular region and (3) second toe (for pressure) or the lateral aspect of the leg (for heat and cold). The influences of gender, age, height, weight, body-mass index (BMI), body side of testing, depression, anxiety, catastrophizing and parameters of Short-Form 36 (SF-36) were analyzed by multiple regressions. Quantile regressions were performed to define the 5th, 10th and 25th percentiles as reference values for pain hypersensitivity and the 75th, 90th and 95th percentiles as reference values for pain hyposensitivity. Gender, age and/or the interaction of age with gender were the only variables that consistently affected the pain measures. Women were more pain sensitive than men. However, the influence of gender decreased with increasing age. In conclusion, normative values of parameters related to pressure, heat and cold pain stimuli were determined. Reference values have to be stratified by body region, gender and age. The determination of these reference values will now allow the clinical application of the tests for detecting abnormal pain reactions in individual patients.
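A quantile regression of the kind described above, used to derive percentile-based reference values stratified by age and gender, might look like the following sketch; the simulated pressure pain thresholds (ppt_kpa) and coefficients are purely illustrative and assume statsmodels is available.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 300

# Simulated pain-free cohort: age, gender and a pressure pain threshold in kPa
df = pd.DataFrame({
    "age": rng.integers(18, 80, n),
    "female": rng.integers(0, 2, n),
})
df["ppt_kpa"] = 400 - 80 * df["female"] + 1.5 * df["age"] + rng.normal(0, 60, n)

# Quantile regressions for the lower (hypersensitivity) and upper (hyposensitivity) reference limits
for q in (0.05, 0.95):
    fit = smf.quantreg("ppt_kpa ~ age + female", df).fit(q=q)
    print(f"{int(q * 100)}th percentile model:\n{fit.params}\n")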
Abstract:
In most pathology laboratories worldwide, formalin-fixed, paraffin-embedded (FFPE) samples are the only tissue specimens available for routine diagnostics. Although commercial kits for diagnostic molecular pathology testing are becoming available, most of the current diagnostic tests are laboratory-based assays. Thus, there is a need for standardized procedures in molecular pathology, starting with the extraction of nucleic acids. To evaluate the current methods for extracting nucleic acids from FFPE tissues, 13 European laboratories participating in the European FP6 program IMPACTS (www.impactsnetwork.eu) isolated nucleic acids from four diagnostic FFPE tissues using their routine methods, followed by quality assessment. The DNA-extraction protocols ranged from in-house protocols to commercial kits. Except for one in-house protocol, the methods gave comparable results in terms of the quality of the extracted DNA, as measured by the ability to amplify control gene fragments of different sizes by PCR. For array applications or tests that require an accurately determined DNA input, we recommend using silica-based adsorption columns for DNA recovery. For RNA extractions, the best results were obtained with chromatography-column-based commercial kits, which yielded the largest quantity of assayable RNA of the best quality. Quality testing using RT-PCR gave successful amplification of 200-250 bp PCR products from most tested tissues. Modification of the proteinase K digestion time led to better results, even when commercial kits were used. The results of the study emphasize the need for quality control of the nucleic acid extracts with standardised methods to prevent false-negative results and to allow data comparison among different diagnostic laboratories.
Abstract:
Clinical manifestations of lactase (LCT) deficiency include intestinal and extra-intestinal symptoms. The lactose hydrogen breath test (H2-BT) is considered the gold standard for evaluating LCT deficiency (LD). Recently, the single-nucleotide polymorphism C/T(-13910) has been associated with LD. The objectives of the present study were to evaluate the agreement between genetic testing for LCT C/T(-13910) and the lactose H2-BT, and the diagnostic value of extended symptom assessment. Of the 201 patients included in the study, 194 with clinical suspicion of LD (139 females, mean age 38 years, range 17-79; 55 males, mean age 38 years, range 18-68) underwent a 3-4 h H2-BT and genetic testing for LCT C/T(-13910). Patients rated five intestinal and four extra-intestinal symptoms during the H2-BT and then at home for the following 48 h. With the H2-BT declared the gold standard, the CC(-13910) genotype had a sensitivity of 97% and a specificity of 95%, with a κ of 0.9, in diagnosing LCT deficiency. Patients with LD had more intense intestinal symptoms in the 4 h following the lactose challenge included in the H2-BT. We found no difference in the intensity of extra-intestinal symptoms between patients with and without LD. Symptom assessment yielded differences for the intestinal symptoms abdominal pain, bloating, borborygmi and diarrhoea between 120 min and 4 h after the oral lactose challenge. Extra-intestinal symptoms (dizziness, headache and myalgia) and extension of symptom assessment up to 48 h did not consistently show different results. In conclusion, genetic testing shows excellent agreement with the standard lactose H2-BT, and it may replace breath testing for the diagnosis of LD. Extended symptom scores and assessment of extra-intestinal symptoms have limited diagnostic value in the evaluation of LD.
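Sensitivity, specificity and the kappa agreement reported above all derive from the 2x2 cross-classification of the genotype result against the H2-BT gold standard. A minimal sketch with invented cell counts, not the study's data:

# 2x2 table: genotype test result (rows) vs. H2-BT gold standard (columns); counts are illustrative
tp, fp, fn, tn = 58, 7, 2, 127

sensitivity = tp / (tp + fn)
specificity = tn / (tn + fp)

po = (tp + tn) / (tp + fp + fn + tn)                                   # observed agreement
pe = ((tp + fp) * (tp + fn) + (fn + tn) * (fp + tn)) / (tp + fp + fn + tn) ** 2   # chance agreement
kappa = (po - pe) / (1 - pe)                                           # chance-corrected agreement

print(f"sensitivity {sensitivity:.2f}, specificity {specificity:.2f}, kappa {kappa:.2f}")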
Abstract:
The best available test for the diagnosis of upper extremity deep venous thrombosis (UEDVT) is contrast venography. The aim of this systematic review was to assess whether the diagnostic accuracy of other tests for clinically suspected UEDVT is high enough to justify their use in clinical practice and to evaluate whether any test can replace venography.
Abstract:
The use of antibiotics is highest in primary care and is directly associated with antibiotic resistance in the community. We assessed regional variations in antibiotic use in primary care in Switzerland and explored prescription patterns in relation to the use of point of care tests. Defined daily doses of antibiotics per 1000 inhabitants per day (DDD(1000pd)) were calculated for the year 2007 from reimbursement data of the largest Swiss health insurer, based on the Anatomical Therapeutic Chemical (ATC) classification and the DDD methodology recommended by WHO. We present ecological associations by means of descriptive and regression analysis. We analysed data from 1 067 934 adults, representing 17.1% of the Swiss population. The rate of outpatient antibiotic prescriptions in the entire population was 8.5 DDD(1000pd), and varied between 7.28 DDD(1000pd) for northwest Switzerland and 11.33 DDD(1000pd) for the Lake Geneva region. The DDD(1000pd) values for the three most prescribed antibiotic groups were 2.90 for amoxicillin and amoxicillin-clavulanate, 1.77 for fluoroquinolones, and 1.34 for macrolides. Regions with higher DDD(1000pd) showed higher seasonal variability in antibiotic use and lower use of all point of care tests. In regression analyses for each class of antibiotics, the use of any point of care test was consistently associated with fewer antibiotic prescriptions. Prescription rates of primary care physicians varied between Swiss regions and were lower in northwest Switzerland and among physicians using point of care tests. Ecological studies are prone to bias, and whether point of care tests reduce antibiotic use has to be investigated in pragmatic primary care trials.
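The DDD methodology mentioned above amounts to dividing the dispensed amount of each antibiotic by its WHO-defined daily dose and normalising by population and time. A minimal sketch under these assumptions; the drug list, DDD values and dispensing figures are given for illustration only and should be checked against the current WHO ATC/DDD index.

# WHO DDD values in grams per defined daily dose (oral route; illustrative, verify against the ATC/DDD index)
WHO_DDD_G = {"amoxicillin": 1.5, "ciprofloxacin": 1.0, "clarithromycin": 0.5}

# Total grams dispensed over one year (invented numbers) and the insured adult population from the abstract
dispensed_g = {"amoxicillin": 1_500_000, "ciprofloxacin": 400_000, "clarithromycin": 250_000}
population = 1_067_934
days = 365

for drug, grams in dispensed_g.items():
    ddd_1000pd = grams / WHO_DDD_G[drug] / population / days * 1000
    print(f"{drug}: {ddd_1000pd:.2f} DDD per 1000 persons per day")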
Abstract:
Summary The first part of this review examined ISO approval requirements and in vitro testing. In the second part, non-standardized test methods for composite materials are presented and discussed. Physical tests are primarily described. Analyses of surface gloss and alterations, as well as aging simulations of dental materials are presented. Again, the importance of laboratory tests in determining clinical outcomes is evaluated. Differences in the measurement protocols of the various testing institutes and how these differences can influence the results are also discussed. Because there is no standardization of test protocols, the values determined by different institutes cannot be directly compared. However, the ranking of the tested materials should be the same if a valid protocol is applied by different institutes. The modulus of elasticity, the expansion after water sorption, and the polishability of the material are all clinically relevant, whereas factors measured by other test protocols may have no clinical correlation. The handling properties of the materials are highly dependent on operators' preferences. Therefore, no standard values can be given.
Abstract:
The first part of this three-part review on the relevance of laboratory testing of composites and adhesives deals with the approval requirements for composite materials. We compare in vivo and in vitro literature data and discuss the relevance of in vitro analyses. The standardized ISO protocols are presented, with a focus on the evaluation of physical parameters. These tests all follow a standardized protocol that describes the entire test set-up. The tests analyse flexural strength, depth of cure, susceptibility to ambient light, color stability, water sorption and solubility, and radiopacity. Some tests have a clinical correlation. A high flexural strength, for instance, decreases the risk of fractures of the marginal ridge in posterior restorations and of incisal edge build-ups of restored anterior teeth. Other tests have no clinical correlation, or their threshold values are too low, which results in the approval of materials that show inferior clinical properties (e.g., radiopacity). It is advantageous to know the test set-ups and the ideal threshold values in order to interpret the material data correctly. Overall, however, laboratory assessment alone cannot ensure the clinical success of a product.
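As one concrete example of the physical parameters mentioned above, the flexural strength determined in the ISO-style three-point bend test follows directly from the fracture load and the specimen geometry. A minimal sketch of that calculation; the 2 x 2 x 25 mm specimen with a 20 mm support span reflects the commonly used ISO 4049 geometry, and the fracture load is an invented example value.

def flexural_strength_mpa(load_n, span_mm, width_mm, height_mm):
    # sigma = 3 * F * l / (2 * b * h^2), with F in N and dimensions in mm, giving MPa
    return 3 * load_n * span_mm / (2 * width_mm * height_mm ** 2)

# Illustrative fracture load of 40 N on a 2 mm x 2 mm cross-section over a 20 mm span
print(flexural_strength_mpa(load_n=40.0, span_mm=20.0, width_mm=2.0, height_mm=2.0))  # 150.0 MPa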
Abstract:
INTRODUCTION: Rivaroxaban (RXA) is licensed for prophylaxis of venous thromboembolism after major orthopaedic surgery of the lower limbs. Currently, no test to quantify RXA in plasma has been validated in an inter-laboratory setting. Our study had three aims: to assess i) the feasibility of RXA quantification with a commercial anti-FXa assay, ii) its accuracy and precision in an inter-laboratory setting, and iii) the influence of 10 mg of RXA on routine coagulation tests. METHODS: The same chromogenic anti-FXa assay (Hyphen BioMed) was used in all participating laboratories. RXA calibrators and sets of blinded probes (aim ii) were prepared in vitro by spiking normal plasma. The precise RXA content was assessed by high-pressure liquid chromatography-tandem mass spectrometry. For the ex vivo studies (aim iii), plasma samples from 20 healthy volunteers, taken before and 2-3 hours after ingestion of 10 mg of RXA, were analyzed by the participating laboratories. RESULTS: RXA can be assayed chromogenically. Among the participating laboratories, the mean accuracy and the mean coefficient of variation for precision of RXA quantification were 7.0% and 8.8%, respectively. The mean RXA concentration was 114 ± 43 µg/L. RXA significantly altered the prothrombin time, the activated partial thromboplastin time, and factor analyses for intrinsic and extrinsic factors. Determinations of thrombin time, fibrinogen, FXIII and D-dimer levels were not affected. CONCLUSIONS: RXA plasma levels can be quantified accurately and precisely with a chromogenic anti-FXa assay on different coagulometers in different laboratories. Ingestion of 10 mg RXA results in significant alterations of both PT- and aPTT-based coagulation assays.
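The two inter-laboratory performance measures quoted above, accuracy (deviation of the measured mean from the spiked target) and precision (coefficient of variation), can be sketched as follows; the target concentration and the per-laboratory measurements are invented for the example.

import numpy as np

target_ug_l = 100.0                                            # spiked RXA concentration (illustrative)
measured = np.array([93.0, 104.5, 108.0, 97.5, 110.0, 95.0])   # one result per laboratory (illustrative)

accuracy_pct = abs(measured.mean() - target_ug_l) / target_ug_l * 100   # deviation from the target
cv_pct = measured.std(ddof=1) / measured.mean() * 100                   # inter-laboratory CV

print(f"accuracy deviation {accuracy_pct:.1f}%, CV {cv_pct:.1f}%")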
Abstract:
Visual outcomes in the treatment of neovascular age-related macular degeneration (AMD) with intravitreally injected anti-VEGF (IVT) clearly depend on injection frequency. According to the European approval, ranibizumab is to be used only in cases of recurrent visual loss after the loading phase. In contrast, monthly treatment, as provided in the ANCHOR and MARINA studies, is generally allowed in Switzerland. However, attempts are commonly made to reduce the injection frequency because of the cost pressure in all health systems and, of course, also because of the strict monitoring and reinjection regimens required, which raise management problems as patient numbers increase. In this article, the treatment regimen of our University Eye Hospital is presented, in which a reduced injection frequency leads to essentially the same improved and stable visual results as in ANCHOR and MARINA, while still requiring significantly more injections than are generally given in other European countries. The key to achieving this in a large number of patients is the restructuring of our outpatient flow for IVT patients, with particular emphasis on patient separation and standardisation of treatment steps, which significantly reduces the time required per patient. Measurements of timing and patient satisfaction before and after restructuring underline its importance for being able to treat more patients at high quality in the future. The exceptional importance of spectral-domain OCT measurements as the most important criterion for indicating re-treatment is illustrated.