37 results for Antibiotic sensitivity test
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
Background Interferon-gamma release assays (IGRA) are more specific than the tuberculin skin test (TST) for the diagnosis of Mycobacterium tuberculosis infection. Data on sensitivity in HIV infection are controversial. Methods IGRA (T-SPOT.TB) was performed using lymphocytes stored within 6 months before culture-confirmed tuberculosis was diagnosed in HIV-infected individuals in the Swiss HIV Cohort Study. Results 64 individuals (69% males, 45% of non-white ethnicity, median age 35 years (interquartile range [IQR] 31-42), 28% with prior AIDS) were analysed. Median CD4 cell count was 223 cells/μl (IQR 103-339), and HIV-RNA was 4.7 log10 copies/mL (IQR 4.3-5.2). T-SPOT.TB was positive in 25 patients (39%), negative in 18 (28%) and indeterminate in 21 (33%), corresponding to a sensitivity of 39% (95% CI 27-51%) if all test results were considered, and 58% (95% CI 43-74%) if indeterminate results were excluded. Sensitivity of IGRA was independent of CD4 cell count (p = 0.698). Among 44 individuals with an available TST, 22 (50%) had a positive TST. Agreement between TST and IGRA was 57% (kappa = 0.14, p = 0.177), and in 34% (10/29) both tests were positive. Combining TST and IGRA (at least one test positive) improved sensitivity to 67% (95% CI 52-81%). In multivariate analysis, older age was associated with negative results on both TST and T-SPOT.TB (OR 3.07, 95% CI 1.22-7.74, p = 0.017, per 10 years older). Conclusions T-SPOT.TB and TST have similar sensitivity for detecting latent TB in HIV-infected individuals. Combining TST and IGRA may help clinicians better select HIV-infected individuals with latent tuberculosis who qualify for preventive treatment.
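The two reported sensitivities follow directly from the counts given in the abstract; a minimal Python sketch shows how counting indeterminate results as non-positive versus excluding them changes the estimate:

```python
def sensitivity(true_pos, false_neg):
    """Sensitivity = TP / (TP + FN) among culture-confirmed cases."""
    return true_pos / (true_pos + false_neg)

# Counts from the abstract: 25 positive, 18 negative, 21 indeterminate
# T-SPOT.TB results among 64 culture-confirmed tuberculosis cases.
overall = sensitivity(25, 18 + 21)        # indeterminates counted as non-positive
determinate_only = sensitivity(25, 18)    # indeterminates excluded
print(f"{overall:.0%} / {determinate_only:.0%}")  # 39% / 58%
```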
Abstract:
BACKGROUND Anecdotal evidence suggests that the sensitivity and specificity of a diagnostic test may vary with disease prevalence. Our objective was to investigate the associations between disease prevalence and test sensitivity and specificity using studies of diagnostic accuracy. METHODS We used data from 23 meta-analyses, each of which included 10-39 studies (416 total). The median prevalence per review ranged from 1% to 77%. We evaluated the effects of prevalence on sensitivity and specificity using a bivariate random-effects model for each meta-analysis, with prevalence as a covariate. We estimated the overall effect of prevalence by pooling the effects using the inverse variance method. RESULTS Within a given review, a change in prevalence from the lowest to the highest value resulted in a corresponding change in sensitivity or specificity of between 0 and 40 percentage points. This effect was statistically significant (p < 0.05) for either sensitivity or specificity in 8 meta-analyses (35%). Overall, specificity tended to be lower with higher disease prevalence; there was no such systematic effect for sensitivity. INTERPRETATION The sensitivity and specificity of a test often vary with disease prevalence; this effect is likely to be the result of mechanisms, such as patient spectrum, that affect prevalence, sensitivity and specificity. Because it may be difficult to identify such mechanisms, clinicians should use prevalence as a guide when selecting studies that most closely match their situation.
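The inverse-variance pooling step mentioned in the methods can be sketched as follows. This is a generic fixed-effect version with invented numbers, not the authors' bivariate random-effects model:

```python
import math

def inverse_variance_pool(effects, variances):
    """Pool per-study effect estimates, weighting each by 1/variance."""
    weights = [1.0 / v for v in variances]
    pooled = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled_se

# Hypothetical per-review prevalence effects and their variances.
effect, se = inverse_variance_pool([0.20, 0.40, 0.30], [0.01, 0.01, 0.02])
```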
Abstract:
People with sequence-space synesthesia (SSS) report stable visuo-spatial forms corresponding to numbers, days, and months (amongst others). This type of synesthesia has intrigued scientists for over 130 years, but the lack of an agreed-upon tool for assessing it has held back research on the phenomenon. The present study builds on previous tests by measuring the consistency of spatial locations, which is known to discriminate synesthetes from controls. We document, for the first time, the sensitivity and specificity of such a test and suggest a diagnostic cut-off point for discriminating between the groups, based on the area bounded by different placement attempts with the same item.
Abstract:
Recent studies have shown that the nociceptive withdrawal reflex threshold (NWR-T) and the electrical pain threshold (EP-T) are reliable measures in pain-free populations. However, it is necessary to investigate the reliability of these measures in patients with chronic pain in order to translate these techniques from laboratory to clinic. The aims of this study were to determine the test-retest reliability of the NWR-T and EP-T after single and repeated (temporal summation) electrical stimulation in a group of patients with chronic low back pain, and to investigate the association between the NWR-T and the EP-T. To this end, 25 patients with chronic pain participated in three identical sessions, separated by 1 week on average, in which the NWR-T and the EP-T to single and repeated stimulation were measured. Test-retest reliability was assessed using the intra-class correlation coefficient (ICC), the coefficient of variation (CV), and Bland-Altman analysis. The association between the thresholds was assessed using the coefficient of determination (r^2). The results showed good-to-excellent reliability for both NWR-T and EP-T in all cases, with average ICC values ranging from 0.76 to 0.90 and average CV values ranging from 12.0% to 17.7%. The association between thresholds was better after repeated stimulation than after single stimulation, with average r^2 values of 0.83 and 0.56, respectively. In conclusion, the NWR-T and the EP-T are reliable tools for assessing the sensitivity of spinal nociceptive pathways in patients with chronic pain.
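Bland-Altman analysis, one of the reliability measures used here, reduces to a mean bias and 95% limits of agreement over paired sessions. A minimal sketch with invented threshold readings (not the study's data):

```python
import statistics

def bland_altman(session1, session2):
    """Mean bias and 95% limits of agreement between two test sessions."""
    diffs = [a - b for a, b in zip(session1, session2)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical NWR-T values (mA) for four patients in two sessions.
bias, (lower, upper) = bland_altman([10.0, 12.0, 11.0, 13.0],
                                    [9.0, 11.0, 12.0, 12.0])
```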
Abstract:
The use of antibiotics is highest in primary care and directly associated with antibiotic resistance in the community. We assessed regional variations in antibiotic use in primary care in Switzerland and explored prescription patterns in relation to the use of point of care tests. Defined daily doses of antibiotics per 1000 inhabitants (DDD(1000pd)) were calculated for the year 2007 from reimbursement data of the largest Swiss health insurer, based on the anatomic therapeutic chemical classification and the DDD methodology recommended by the WHO. We present ecological associations by use of descriptive and regression analysis. We analysed data from 1 067 934 adults, representing 17.1% of the Swiss population. The rate of outpatient antibiotic prescriptions in the entire population was 8.5 DDD(1000pd), and varied between 7.28 and 11.33 DDD(1000pd) for northwest Switzerland and the Lake Geneva region. DDD(1000pd) values for the three most prescribed antibiotics were 2.90 for amoxicillin and amoxicillin-clavulanate, 1.77 for fluoroquinolones, and 1.34 for macrolides. Regions with higher DDD(1000pd) showed higher seasonal variability in antibiotic use and lower use of all point of care tests. In regression analysis for each class of antibiotics, the use of any point of care test was consistently associated with fewer antibiotic prescriptions. Prescription rates of primary care physicians varied between Swiss regions and were lower in northwest Switzerland and among physicians using point of care tests. Ecological studies are prone to bias, and whether point of care tests reduce antibiotic use has to be investigated in pragmatic primary care trials.
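The WHO DDD-per-1000-inhabitants metric used above is a straightforward normalization of dispensed drug quantity by a standard daily dose, population, and time. A sketch with invented dispensing figures (not the study's data; the 1000 mg DDD is assumed purely for illustration):

```python
def ddd_per_1000_inhabitants_per_day(total_mg, who_ddd_mg, population, days):
    """Defined daily doses per 1000 inhabitants per day (WHO ATC/DDD methodology)."""
    n_ddd = total_mg / who_ddd_mg          # number of standard daily doses dispensed
    return n_ddd / (population * days) * 1000.0

# Hypothetical: 3 650 000 mg of an antibiotic with an assumed DDD of 1000 mg,
# dispensed over one year to a population of 1000 adults.
rate = ddd_per_1000_inhabitants_per_day(3_650_000, 1000, 1000, 365)
```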
Abstract:
PURPOSE: The purpose of our study was to retrospectively evaluate the specificity, sensitivity and accuracy of computed tomography (CT), digital radiography (DR) and low-dose linear slit digital radiography (LSDR, Lodox®) in the detection of internal cocaine containers. METHODS: Institutional review board approval was obtained. The study consisted of 83 patients (76 males, 7 females, aged 16-45 years) suspected of having incorporated cocaine drug containers. All underwent radiological imaging; a total of 135 exams were performed: CT, n=35; DR, n=70; LSDR, n=30. An overall calculation for all "drug mules" and a specific evaluation of body packers, pushers and stuffers were performed. The gold standard was stool examination in a dedicated holding cell equipped with a drug toilet. RESULTS: There were 54 drug mules identified in this study. For all drug carriers, CT showed the highest diagnostic accuracy (97.1%), sensitivity (100%) and specificity (94.1%). DR was 71.4% accurate, 58.3% sensitive and 85.3% specific. LSDR was 60% accurate, 57.9% sensitive and 63.4% specific. CONCLUSIONS: CT was the most accurate test studied. Therefore, the detection of internal cocaine drug packs should be performed by CT, rather than by conventional X-ray, in order to apply the most sensitive exam in the medico-legal investigation of suspected drug carriers. Nevertheless, the higher radiation dose applied by CT compared with DR or LSDR needs to be considered. Future studies should evaluate low-dose CT protocols in order to reduce this dose.
Abstract:
Background Urinary tract infections (UTI) are frequent in outpatients. Fast pathogen identification is mandatory for shortening the time of discomfort and preventing serious complications. Urine culture requires up to 48 hours for pathogen identification; consequently, the initial antibiotic regimen is empirical. Aim To evaluate the feasibility of qualitative urine pathogen identification by a commercially available real-time PCR blood pathogen test (SeptiFast®) and to compare the results with dipslide and microbiological culture. Design of study Pilot study with prospectively collected urine samples. Setting University hospital. Methods 82 prospectively collected urine samples from 81 patients with suspected UTI were included. Dipslide urine culture was followed by microbiological pathogen identification in dipslide-positive samples. In parallel, qualitative DNA-based pathogen identification (SeptiFast®) was performed in all samples. Results 61 samples were SeptiFast® positive, whereas 67 samples were dipslide culture positive. The inter-methodological concordance of positive and negative findings in the gram-positive, gram-negative and fungal sectors was 371/410 (90%), 477/492 (97%) and 238/246 (97%), respectively. Sensitivity and specificity of the SeptiFast® test for the detection of an infection were 0.82 and 0.60, respectively. SeptiFast® pathogen identifications were available at least 43 hours before culture results. Conclusion The SeptiFast® platform identified bacterial DNA in urine specimens considerably faster than conventional culture. For UTI diagnosis, sensitivity and specificity are limited by its present qualitative setup, which does not allow pathogen quantification. Future quantitative assays may hold promise for PCR-based UTI pathogen identification as a supplement to conventional culture methods.
Abstract:
Fluorescence microlymphography (FML) is used to visualize the lymphatic capillaries. A maximum spread of the fluorescent dye of ≥ 12 mm has been suggested for the diagnosis of lymphedema. However, data on sensitivity and specificity are lacking. The aim of this study was to investigate the accuracy of FML for diagnosing lymphedema in patients with leg swelling. Patients with lower extremity swelling were clinically assessed and separated into lymphedema and non-lymphatic edema groups. FML was studied in all affected legs and the maximum spread of lymphatic capillaries was measured. Test accuracy and receiver operating characteristic (ROC) analyses were performed to assess possible threshold values that predict lymphedema. Between March 2008 and August 2011, a total of 171 patients (184 legs) with a median age of 43.5 years (IQR 24, 54) were assessed. Of those, 94 legs (51.1%) were diagnosed with lymphedema. The sensitivity, specificity, positive and negative likelihood ratios, and positive and negative predictive values were 87%, 64%, 2.45, 0.20, 72% and 83% for the 12-mm cut-off level and 79%, 83%, 4.72, 0.26, 83% and 79% for the 14-mm cut-off level, respectively. The area under the ROC curve was 0.82 (95% CI: 0.76, 0.88). Sensitivity was higher in secondary than in primary lymphedema (95.0% vs 74.3%, p = 0.045). No major adverse events were observed. In conclusion, FML is a simple and safe technique for detecting lymphedema in patients with leg swelling. A cut-off level of ≥ 14-mm maximum spread has high sensitivity and high specificity for detecting lymphedema and should be chosen.
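The likelihood ratios quoted above derive directly from sensitivity and specificity. A minimal sketch; note that the abstract's exact values (2.45, 0.20) come from the underlying patient counts, so the rounded percentage inputs reproduce them only approximately:

```python
def likelihood_ratios(sens, spec):
    """LR+ = sens / (1 - spec); LR- = (1 - sens) / spec."""
    return sens / (1.0 - spec), (1.0 - sens) / spec

# 12-mm cut-off values from the abstract (rounded percentages).
lr_pos, lr_neg = likelihood_ratios(0.87, 0.64)   # approx. 2.4 and 0.20
```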
Abstract:
REASONS FOR PERFORMING STUDY: Insect bite hypersensitivity (IBH) is an IgE-mediated allergic dermatitis caused by bites of Culicoides and Simulium species, and improved means of diagnosis are required. OBJECTIVES: The cellular antigen stimulation test (CAST) with C. nubeculosus and S. vittatum extracts was assessed in a population of IBH-affected and healthy horses. Variation in test results over a one-year period and possible cross-reactivity between different insect extracts were studied. METHODS: A total of 314 mature horses were studied using the CAST. The influence of severity of clinical signs, gender and age was evaluated, and 32 horses were tested repeatedly over one year. The kappa reliability test was used to assess agreement of the test results with different insect extracts. RESULTS: Horses with IBH had significantly higher sulfidoleukotriene (sLT) release than controls with C. nubeculosus and S. vittatum. The highest diagnostic sensitivity and specificity were attained when using adult C. nubeculosus extracts with the CAST (78% and 97%, respectively), suggesting that most horses with IBH are sensitised against Culicoides allergens. A proportion of IBH-affected horses was found to be sensitised to allergens of Simulium spp. in addition to those of C. nubeculosus. The CAST with C. nubeculosus had positive and negative predictive values ≥ 80% for a true prevalence of IBH of 12-52%. In the follow-up study, the proportion of IBH-affected horses with a positive test result ranged from 90% in November to 68% in March. Severity of clinical signs and age did not influence test results significantly. However, IBH-affected males achieved significantly more positive test results than IBH-affected females. CONCLUSIONS: The CAST with adult C. nubeculosus has high specificity and good sensitivity for diagnosis of IBH. Horses with IBH are mainly sensitised to Culicoides allergens, and some horses are additionally sensitised to allergens in Simulium spp.
POTENTIAL RELEVANCE: The CAST is likely to be a useful test for diagnosis of IBH, even allowing the identification of IBH-affected but asymptomatic horses. This test may also help in further characterisation of allergens involved in this condition.
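The kappa reliability statistic used above to assess agreement between insect extracts is Cohen's kappa over a 2x2 agreement table. A minimal sketch with invented counts (not the study's data):

```python
def cohens_kappa(both_pos, only_a, only_b, both_neg):
    """Chance-corrected agreement between two tests applied to the same subjects."""
    n = both_pos + only_a + only_b + both_neg
    observed = (both_pos + both_neg) / n
    expected = ((both_pos + only_a) * (both_pos + only_b)
                + (both_neg + only_b) * (both_neg + only_a)) / n**2
    return (observed - expected) / (1.0 - expected)

# Hypothetical agreement table for two extracts tested on 100 horses.
kappa = cohens_kappa(40, 5, 5, 50)
```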
Abstract:
The purpose of this study was to acquire information about the effect of an antibacterial and biodegradable poly-L-lactide (PLLA) coated titanium plate osteosynthesis on local infection resistance. For our in vitro and in vivo experiments, we used six-hole AO DC minifragment titanium plates. The implants were coated with biodegradable, semiamorphous PLLA (coating approximately 30 μm thick). This acted as a carrier substance to which either antibiotics or antiseptics were added. The antibiotic we applied was a combination of rifampicin and fusidic acid; the antiseptic was a combination of Octenidin and Irgasan. This produced the following groups: Group I: six-hole AO DC minifragment titanium plate without PLLA; Group II: six-hole AO DC minifragment titanium plate with PLLA without antibiotics/antiseptics; Group III: six-hole AO DC minifragment titanium plate with PLLA + 3% rifampicin and 7% fusidic acid; Group IV: six-hole AO DC minifragment titanium plate with PLLA + 2% Octenidin and 8% Irgasan. In vitro, we investigated the degradation and release of the PLLA coating over a period of 6 weeks, the bactericidal efficacy of the antibiotics/antiseptics after their release from the coating, and the adhesion of Staphylococcus aureus to the implants. In vivo, we compared the infection rates in white New Zealand rabbits after titanium plate osteosynthesis of the tibia with or without antibacterial coating after local percutaneous bacterial inoculation at different concentrations (2 × 10⁵-2 × 10⁸): the plate, the contaminated soft tissues and the underlying bone were removed under sterile conditions after 28 days and quantitatively evaluated for bacterial growth. A stepwise experimental design with an "up-and-down" dosage technique was used to adjust the bacterial challenge in the area of the ID50 (50% infection dose).
Statistical evaluation of the differences between the infection rates of the groups was performed using the two-sided Fisher exact test (p < 0.05). Over a period of 6 weeks, a continuous degradation of the PLLA coating of 13%, on average, was seen in vitro in 0.9% NaCl solution. The elution tests on titanium implants with antibiotic or antiseptic coatings produced average release values of 60% of the incorporated antibiotic or 62% of the incorporated antiseptic within the first 60 min. This was followed by a much slower, but nevertheless continuous, release of the incorporated antibiotic and antiseptic over days and weeks. At the end of the 42-day test period, 20% of the incorporated antibiotic and 15% of the incorporated antiseptic had not yet been released from the coating. The antibacterial effect of the antibiotic/antiseptic is not lost by integrating it into the PLLA coating. The overall infection rate in the in vivo investigation was 50%. In Groups I and II, the infection rate was 83% each (10 of 12 animals). In Groups III and IV, with antibacterial coating, the infection rate was 17% each (2 of 12 animals). The ID50 in the antibacterially coated Groups III and IV was 1 × 10⁸ CFU, whereas the ID50 values in Groups I and II without antibacterial coating were a hundred times lower, at 1 × 10⁶ CFU. The difference between the groups with and without antibacterial coating was statistically significant (p = 0.033). Using an antibacterial biodegradable PLLA coating on titanium plates, a significant reduction in infection rate was demonstrated in an in vitro and in vivo investigation. For the first time, to our knowledge, we were able to show, under standardized and reproducible conditions, that an antiseptic coating leads to the same reduction in infection rate as an antibiotic coating.
Taking the problem of antibiotic-induced bacterial resistance into consideration, we thus regard the antiseptic coating, which shows the same level of effectiveness, as advantageous.
Abstract:
Background: A growing body of literature suggests that caregiving burden is associated with impaired immune system functioning, which may contribute to elevated morbidity and mortality risk among dementia caregivers. However, the potential mechanisms linking these relationships are not well understood. The purpose of this study was to investigate whether stress-related experience of depressive symptoms and reductions in personal mastery were related to alterations in β2-adrenergic receptor sensitivity. Methods: Spousal Alzheimer's caregivers (N = 106) completed measures assessing the extent to which they felt overloaded by their caregiving responsibilities, experienced depressive symptoms, and believed their life circumstances were under their control. We hypothesized that caregivers reporting elevated stress would report increased depressive symptoms and reduced mastery, which in turn would be associated with reduced β2-adrenergic receptor sensitivity on peripheral blood mononuclear cells (PBMC), as assessed by in vitro isoproterenol stimulation. Results: Regression analyses indicated that overload was negatively associated with mastery (beta = -0.36, p = 0.001) and receptor sensitivity (beta = -0.24, p = 0.030), whereas mastery was positively associated with receptor sensitivity (beta = 0.29, p = 0.005). Finally, the relationship between overload and receptor sensitivity diminished upon simultaneous entry of mastery. Sobel's test confirmed that mastery significantly mediated part of the relationship between overload and receptor sensitivity (z = -2.02, p = 0.044). Conclusions: These results suggest that a reduced sense of mastery may help explain the association between caregiving burden and reduced immune cell β2-receptor sensitivity.
Abstract:
Although the diagnosis of Gitelman syndrome (GS) and Bartter syndrome (BS) is now feasible by genetic analysis, implementation of genetic testing for these disorders is still hampered by several difficulties, including large gene dimensions, lack of hot-spot mutations, heavy workup time, and costs. This study evaluated, in a cohort of patients with genetically proven GS or BS, the diagnostic sensitivity and specificity of a diuretic test with oral hydrochlorothiazide (HCT test). Forty-one patients with GS (22 adults, aged 25 to 57; 19 children-adolescents, aged 7 to 17) and seven patients with BS (five type I, two type III) were studied; three patients with "pseudo-BS" from surreptitious diuretic intake (two patients) or vomiting (one patient) were also included. The HCT test consisted of the administration of 50 mg of HCT orally (1 mg/kg in children-adolescents) and measurement of the maximal diuretic-induced increase over baseline in chloride fractional clearance during the subsequent 3 h. All but three patients with GS, but no patients with BS or pseudo-BS, showed a blunted (<2.3%) response to HCT; patients with BS and the two patients with pseudo-BS from diuretic intake had an increased response to HCT. No overlap existed between patients with GS and patients with either BS or pseudo-BS. The response to the HCT test is blunted in patients with GS but not in patients with BS or nongenetic hypokalemia. In patients with the highly selected phenotype of normotensive hypokalemic alkalosis, an abnormal HCT test predicts the Gitelman genotype with very high sensitivity and specificity and may avoid genotyping.
Abstract:
BACKGROUND: Patients with chemotherapy-related neutropenia and fever are usually hospitalized and treated with empirical intravenous broad-spectrum antibiotic regimens. Early diagnosis of sepsis in children with febrile neutropenia remains difficult due to non-specific clinical and laboratory signs of infection. We aimed to analyze whether IL-6 and IL-8 could define a group of patients at low risk of septicemia. METHODS: A prospective study was performed to assess the potential value of IL-6, IL-8 and C-reactive protein serum levels to predict severe bacterial infection or bacteremia in febrile neutropenic children with cancer during chemotherapy. Statistical tests used: Friedman test, Wilcoxon test, Kruskal-Wallis H test, Mann-Whitney U test and receiver operating characteristic analysis. RESULTS: The analysis of cytokine levels measured at the onset of fever indicated that IL-6 and IL-8 are useful to define a group of patients at low risk of sepsis. In predicting bacteremia or severe bacterial infection, IL-6 was the best predictor, with an optimum cut-off level of 42 pg/ml showing high sensitivity (90%) and specificity (85%). CONCLUSION: These findings may have clinical implications for risk-based antimicrobial treatment strategies.
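An optimum ROC cut-off such as the 42 pg/ml IL-6 threshold is typically chosen by maximizing Youden's J = sensitivity + specificity - 1 over candidate thresholds. A sketch with invented cytokine values (not the study's data):

```python
def youden_cutoff(sick, healthy, candidates):
    """Return the candidate cut-off maximizing Youden's J = sens + spec - 1."""
    def j(cut):
        sens = sum(v >= cut for v in sick) / len(sick)
        spec = sum(v < cut for v in healthy) / len(healthy)
        return sens + spec - 1.0
    return max(candidates, key=j)

# Hypothetical IL-6 levels (pg/ml) for bacteremic vs non-bacteremic children.
best = youden_cutoff([50, 60, 45, 80], [10, 20, 30, 41], [30, 42, 55])
```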
Abstract:
BACKGROUND: To determine the value of the distance-doubling visual acuity test in the diagnosis of nonorganic visual loss in a comparative observational case series. METHODS: Twenty-one consecutive patients with nonorganic visual acuity loss and 21 subjects with organic visual loss as controls were included. Best corrected visual acuity was tested at the normal distance of 5 meters using Landolt Cs. The patient was then repositioned and best corrected visual acuity was tested with the previous optotypes at double the distance via a mirror. RESULTS: Nonorganic visual acuity loss was identified in 21 of 21 patients. Sensitivity and specificity of the distance-doubling visual acuity test in functional visual loss were found to be 100% (CI: 83%-100%) and 100% (CI: 82%-100%), respectively. CONCLUSION: The distance-doubling visual acuity test is widely used to detect nonorganic visual loss. Our results show that this test has high specificity and sensitivity to detect nonorganic visual impairment.
Abstract:
This study was undertaken to test whether recovery cycle measurements can provide useful information about the membrane potential of human muscle fibers. Multifiber responses to direct muscle stimulation through needle electrodes were recorded from the brachioradialis of healthy volunteers, and the latency changes measured as conditioning stimuli were applied at interstimulus intervals of 2-1000 ms. In all subjects, the relative refractory period (RRP), which lasted 3.27 +/- 0.45 ms (mean +/- SD, n = 12), was followed by a phase of supernormality, in which the velocity increased by 9.3 +/- 3.4% at 6.1 +/- 1.3 ms, and recovered over 1 s. A broad hump of additional supernormality was seen at around 100 ms. Extra conditioning stimuli had little effect on the early supernormality but increased the later component. The two phases of supernormality resembled early and late afterpotentials, attributable respectively to the passive decay of membrane charge and potassium accumulation in the t-tubules. Five minutes of ischemia progressively prolonged the RRP and reduced supernormality, confirming that these parameters are sensitive to membrane depolarization. Velocity recovery cycles may provide useful information about altered muscle membrane potential and t-tubule function in muscle disease. Muscle Nerve, 2008.