Abstract:
The use of molecular tools for genotyping Mycobacterium tuberculosis isolates in epidemiological surveys in order to identify clustered and orphan strains requires faster response times than those offered by the reference method, IS6110 restriction fragment length polymorphism (RFLP) genotyping. A method based on PCR, the mycobacterial interspersed repetitive-unit-variable-number tandem-repeat (MIRU-VNTR) genotyping technique, is an option for fast fingerprinting of M. tuberculosis, although precise evaluations of correlation between MIRU-VNTR and RFLP findings in population-based studies in different contexts are required before the methods are switched. In this study, we evaluated MIRU-VNTR genotyping (with a set of 15 loci [MIRU-15]) in parallel to RFLP genotyping in a 39-month universal population-based study in a challenging setting with a high proportion of immigrants. For 81.9% (281/343) of the M. tuberculosis isolates, both RFLP and MIRU-VNTR types were obtained. The percentages of clustered cases were 39.9% (112/281) and 43.1% (121/281) for RFLP and MIRU-15 analyses, and the numbers of clusters identified were 42 and 45, respectively. For 85.4% of the cases, the RFLP and MIRU-15 results were concordant, identifying the same cases as clustered and orphan (kappa, 0.7). However, for the remaining 14.6% of the cases, discrepancies were observed: 16 of the cases clustered by RFLP analysis were identified as orphan by MIRU-15 analysis, and 25 cases identified as orphan by RFLP analysis were clustered by MIRU-15 analysis. When discrepant cases showing subtle genotypic differences were tolerated, the discrepancies fell from 14.6% to 8.6%. Epidemiological links were found for 83.8% of the cases clustered by both RFLP and MIRU-15 analyses, whereas for the cases clustered by RFLP or MIRU-VNTR analysis alone, links were identified for only 30.8% or 38.9% of the cases, respectively. The latter group of cases mainly comprised isolates that could also have been clustered, if subtle genotypic differences had been tolerated. MIRU-15 genotyping seems to be a good alternative to RFLP genotyping for real-time interventional schemes. The correlation between MIRU-15 and IS6110 RFLP findings was reasonable, although some uncertainties as to the assignation of clusters by MIRU-15 analysis were identified.
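The reported kappa of 0.7 can be reproduced from the figures in the abstract. Below is a minimal Python sketch of Cohen's kappa for the RFLP vs. MIRU-15 clustered/orphan assignment; the 2x2 table is reconstructed from the reported totals (281 typed cases, 112 and 121 clustered, 16 and 25 discrepant), since the abstract does not print the table itself.

```python
# Cohen's kappa for agreement between RFLP and MIRU-15 cluster assignment.
# 2x2 counts reconstructed from the abstract's totals and discrepancies.
both_clustered = 112 - 16   # clustered by both methods
rflp_only = 16              # clustered by RFLP, orphan by MIRU-15
miru_only = 25              # orphan by RFLP, clustered by MIRU-15
both_orphan = 281 - both_clustered - rflp_only - miru_only

n = 281
p_observed = (both_clustered + both_orphan) / n  # 240/281 = 0.854
p_chance = ((both_clustered + rflp_only) * (both_clustered + miru_only) +
            (both_orphan + rflp_only) * (both_orphan + miru_only)) / n ** 2
kappa = (p_observed - p_chance) / (1 - p_chance)
print(f"observed agreement {p_observed:.1%}, kappa {kappa:.2f}")  # 85.4%, 0.70
```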
Abstract:
BACKGROUND: In this study we compared the immunogenicity of influenza vaccine administered intradermally to the standard intramuscular vaccination in lung transplant recipients. METHODS: Patients were randomized to receive the trivalent inactivated seasonal 2008-9 influenza vaccine containing either 6 μg (intradermal) or 15 μg (intramuscular) of hemagglutinin per viral strain. Immunogenicity was assessed by measurement of geometric mean titer of antibodies using the hemagglutination-inhibition (HI) assay. Vaccine response was defined as a 4-fold or higher increase of antibody titers to at least one vaccine antigen. RESULTS: Eighty-five patients received either the intradermal (n = 41) or intramuscular (n = 44) vaccine. Vaccine response was seen in 6 of 41 patients (14.6%) in the intradermal vs 8 of 43 (18.6%) in the intramuscular group (p = 0.77). Seroprotection (HI ≥1:32) was 39% for H1N1, 83% for H3N2 and 29% for B strain in the intradermal group vs 28% for H1N1, 98% for H3N2 and 58% for B strain in the intramuscular group (p = 0.36 for H1N1, p = 0.02 for H3N2, p < 0.01 for B). Mild adverse events were seen in 44% of patients in the intradermal group and 34% in the intramuscular group (p = 0.38). CONCLUSIONS: Immunogenicity of the 2008-9 influenza vaccine given intradermally or intramuscularly was overall poor in lung transplant recipients. Novel strategies for influenza vaccination in this population are needed.
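The response and seroprotection definitions in this abstract are simple threshold rules on HI titers. A small Python sketch follows, with hypothetical pre- and post-vaccination titers for a single patient (the abstract reports only group-level results):

```python
# Classify one patient's HI titers by the abstract's definitions:
# response = >= 4-fold titer rise to at least one vaccine antigen;
# seroprotection = HI titer >= 1:32. Titer values here are hypothetical.
pre  = {"H1N1": 8, "H3N2": 16, "B": 8}     # reciprocal HI titers, pre-vaccine
post = {"H1N1": 16, "H3N2": 128, "B": 16}  # reciprocal HI titers, post-vaccine

vaccine_response = any(post[a] >= 4 * pre[a] for a in pre)  # True via H3N2
seroprotection = {a: post[a] >= 32 for a in post}
print(vaccine_response, seroprotection)
```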
Abstract:
BACKGROUND. Either higher levels of initial DNA damage or lower levels of radiation-induced apoptosis in peripheral blood lymphocytes have been associated with an increased risk of developing late radiation-induced toxicity. It has recently been reported that these two predictive tests are inversely related. The aim of the present study was to investigate the combined role of both tests in relation to clinical radiation-induced toxicity in a set of breast cancer patients treated with high-dose hyperfractionated radical radiotherapy. METHODS. Peripheral blood lymphocytes were taken from 26 consecutive patients with locally advanced breast carcinoma treated with high-dose hyperfractionated radical radiotherapy. Acute and late cutaneous and subcutaneous toxicity was evaluated using the Radiation Therapy Oncology Group morbidity scoring schema. The mean follow-up of survivors (n = 13) was 197.23 months. Radiosensitivity of lymphocytes was quantified as the initial number of DNA double-strand breaks (DSB) induced per Gy and per DNA unit (200 Mbp). Radiation-induced apoptosis (RIA) at 1, 2 and 8 Gy was measured by flow cytometry using annexin V/propidium iodide. RESULTS. The mean DSB/Gy/DNA unit obtained was 1.70 ± 0.83 (range 0.63-4.08; median 1.46). Radiation-induced apoptosis increased with radiation dose (median 12.36, 17.79 and 24.83 for 1, 2 and 8 Gy, respectively). We observed that "expected resistant patients" (DSB values lower than 1.78 DSB/Gy per 200 Mbp and RIA values over 9.58, 14.40 or 24.83 for 1, 2 and 8 Gy, respectively) were at low risk of suffering severe late subcutaneous toxicity (HR 0.223, 95% CI 0.073-0.678, P = 0.008; HR 0.206, 95% CI 0.063-0.677, P = 0.009; HR 0.239, 95% CI 0.062-0.929, P = 0.039, for RIA at 1, 2 and 8 Gy, respectively) in multivariate analysis. CONCLUSIONS. A radiation-resistant profile is proposed, whereby patients who presented lower levels of initial DNA damage and higher levels of radiation-induced apoptosis were at low risk of suffering severe late subcutaneous toxicity after clinical treatment at high radiation doses in our series. However, given the small sample size, prospective studies with larger numbers of patients are needed to validate these results.
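The "expected resistant" profile combines both assays with the cut-offs quoted above. A sketch of that classification rule in Python (the cut-offs are taken from the abstract; the patient values are hypothetical):

```python
# "Expected resistant" = initial DNA damage below 1.78 DSB/Gy per 200 Mbp
# AND radiation-induced apoptosis (RIA) above the dose-specific cut-off.
DSB_CUTOFF = 1.78                           # DSB/Gy per 200 Mbp
RIA_CUTOFF = {1: 9.58, 2: 14.40, 8: 24.83}  # RIA cut-offs at 1, 2, 8 Gy

def expected_resistant(dsb: float, ria: float, dose_gy: int) -> bool:
    return dsb < DSB_CUTOFF and ria > RIA_CUTOFF[dose_gy]

# Hypothetical patient near the reported medians (DSB 1.46, RIA 17.79 at 2 Gy):
print(expected_resistant(dsb=1.46, ria=17.79, dose_gy=2))  # True
```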
Abstract:
OBJECTIVE: Whether or not a high risk of falls increases the risk of bleeding in patients receiving anticoagulants remains a matter of debate. METHODS: We conducted a prospective cohort study involving 991 patients ≥65 years of age who received anticoagulants for acute venous thromboembolism (VTE) at nine Swiss hospitals between September 2009 and September 2012. The study outcomes were the time to a first episode of major bleeding and the time to a first episode of clinically relevant nonmajor bleeding. We determined the associations between the risk of falls and the time to a first episode of bleeding using competing risk regression, accounting for death as a competing event. We adjusted for known bleeding risk factors and anticoagulation as a time-varying covariate. RESULTS: Four hundred fifty-eight of 991 patients (46%) were at high risk of falls. The mean duration of follow-up was 16.7 months. Patients at high risk of falls had a higher incidence of major bleeding (9.6 vs. 6.6 events/100 patient-years; P = 0.05) and a significantly higher incidence of clinically relevant nonmajor bleeding (16.7 vs. 8.3 events/100 patient-years; P < 0.001) than patients at low risk of falls. After adjustment, a high risk of falls was associated with clinically relevant nonmajor bleeding [subhazard ratio (SHR) = 1.74, 95% confidence interval (CI) = 1.23-2.46], but not with major bleeding (SHR = 1.24, 95% CI = 0.83-1.86). CONCLUSION: In elderly patients who receive anticoagulants because of VTE, a high risk of falls is significantly associated with clinically relevant nonmajor bleeding, but not with major bleeding. Whether or not a high risk of falls is a reason against providing anticoagulation beyond 3 months should be based on patient preferences and the risk of VTE recurrence.
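The bleeding incidences above are expressed as events per 100 patient-years; the underlying calculation is straightforward. A Python sketch with hypothetical event counts and follow-up (the abstract reports only the resulting rates):

```python
# Incidence rate per 100 patient-years = 100 * events / total follow-up time.
# The counts below are hypothetical, chosen to reproduce ~9.6 events/100 py.
def rate_per_100py(events: int, patient_years: float) -> float:
    return 100 * events / patient_years

print(round(rate_per_100py(events=61, patient_years=637), 1))  # 9.6
```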
Abstract:
This study examined the validity and reliability of a sequential "Run-Bike-Run" test (RBR) in age-group triathletes. Eight Olympic distance (OD) specialists (age 30.0 ± 2.0 years, mass 75.6 ± 1.6 kg, run VO2max 63.8 ± 1.9 ml· kg(-1)· min(-1), cycle VO2peak 56.7 ± 5.1 ml· kg(-1)· min(-1)) performed four trials over 10 days. Trial 1 (TRVO2max) was an incremental treadmill running test. Trials 2 and 3 (RBR1 and RBR2) involved: 1) a 7-min run at 15 km· h(-1) (R1) plus a 1-min transition to 2) cycling to fatigue (starting at 2 W· kg(-1) body mass, increasing 30 W every 3 min); 3) 10-min cycling at 3 W· kg(-1) (Bsubmax); another 1-min transition; and 4) a second 7-min run at 15 km· h(-1) (R2). Trial 4 (TT) was a 30-min cycle - 20-min run time trial. No significant differences in absolute oxygen uptake (VO2), heart rate (HR), or blood lactate concentration ([BLA]) were evidenced between RBR1 and RBR2. For all measured physiological variables, the limits of agreement were similar, and the mean differences were physiologically unimportant, between trials. Low levels of test-retest error (i.e. ICC > 0.8, CV < 10%) were observed for most (logged) measurements. However, [BLA] post R1 (ICC 0.87, CV 25.1%), [BLA] post Bsubmax (ICC 0.99, CV 16.3%) and [BLA] post R2 (ICC 0.51, CV 22.9%) were least reliable. These error ranges may help coaches detect real changes in training status over time. Moreover, RBR test variables can be used to predict discipline-specific and overall TT performance. Cycle VO2peak, cycle peak power output, and the change between R1 and R2 (deltaR1R2) in [BLA] were most highly related to overall TT distance (r = 0.89, p < 0.01; r = 0.94, p < 0.02; r = 0.86, p < 0.05, respectively). The percentage of TRVO2max at 15 km· h(-1), and deltaR1R2 HR, were also related to run TT distance (r = -0.83 and 0.86, both p < 0.05).
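The reliability statistics quoted (ICC, CV) come from paired RBR1/RBR2 measurements. Below is a Python sketch of the within-subject coefficient of variation via the typical-error approach; the paired values are hypothetical, and typical error is an assumption about the exact method used:

```python
# Test-retest CV from paired trials: typical error = SD(diffs) / sqrt(2),
# CV% = 100 * typical error / grand mean. Values below are hypothetical.
import math

rbr1 = [55.1, 58.3, 60.2, 54.8, 57.5, 59.0, 56.4, 61.0]  # e.g. VO2, trial RBR1
rbr2 = [54.6, 59.0, 59.5, 55.3, 57.9, 58.2, 57.1, 60.2]  # same subjects, RBR2

diffs = [b - a for a, b in zip(rbr1, rbr2)]
mean_d = sum(diffs) / len(diffs)
sd_d = math.sqrt(sum((d - mean_d) ** 2 for d in diffs) / (len(diffs) - 1))
typical_error = sd_d / math.sqrt(2)
grand_mean = (sum(rbr1) + sum(rbr2)) / (2 * len(rbr1))
print(f"CV = {100 * typical_error / grand_mean:.1f}%")
```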
Abstract:
BACKGROUND Challenges exist in the clinical diagnosis of drug-induced liver injury (DILI) and in obtaining information on hepatotoxicity in humans. OBJECTIVE (i) To develop a unified list that combines drugs incriminated in well-vetted or adjudicated DILI cases from many recognized sources and drugs that have been subjected to serious regulatory actions due to hepatotoxicity; and (ii) to supplement the drug list with data on reporting frequencies of liver events in the WHO individual case safety report database (VigiBase). DATA SOURCES AND EXTRACTION Three groups of drugs were collected: (i) drugs identified as causes of DILI at three major DILI registries; (ii) drugs identified as causes of drug-induced acute liver failure (ALF) in six different data sources, including major ALF registries and previously published ALF studies; and (iii) drugs subjected to serious governmental regulatory actions due to their hepatotoxicity in Europe or the US. The reporting frequency of adverse events was determined using VigiBase, computed as the Empirical Bayes Geometric Mean (EBGM) with a 90% confidence interval for two customized terms, 'overall liver injury' and 'ALF'. An EBGM of ≥ 2 was considered a disproportional increase in reporting frequency. The identified drugs were then characterized in terms of regional divergence, published case reports, serious regulatory actions, and reporting frequency of 'overall liver injury' and 'ALF' calculated from VigiBase. DATA SYNTHESIS After excluding herbs, supplements and alternative medicines, a total of 385 individual drugs were identified: 319 drugs were identified in the three DILI registries, 107 in the six ALF registries (or studies), and 47 were subjected to suspension or withdrawal in the US or Europe due to their hepatotoxicity. The identified drugs varied significantly between Spain, the US and Sweden. Of the 319 drugs identified in the DILI registries of adjudicated cases, 93.4% were found in published case reports, 1.9% were suspended or withdrawn due to hepatotoxicity, and 25.7% were also identified in the ALF registries/studies. In VigiBase, 30.4% of the 319 drugs were associated with a disproportionally higher reporting frequency of 'overall liver injury' and 83.1% were associated with at least one reported case of ALF. CONCLUSIONS This newly developed list of drugs associated with hepatotoxicity, together with the multifaceted analysis of hepatotoxicity, will aid in the causality assessment and clinical diagnosis of DILI and will provide a basis for further characterization of hepatotoxicity.
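EBGM values in VigiBase-style analyses are shrinkage estimates of the observed-to-expected reporting ratio. The full Multi-item Gamma Poisson Shrinker is beyond a short example, but the unshrunk observed/expected ratio that EBGM moderates can be sketched as follows (all counts hypothetical):

```python
# Unshrunk disproportionality: observed/expected ratio for a drug-event pair.
# EBGM applies empirical Bayes shrinkage toward 1 on this quantity; a (shrunk)
# value >= 2 was the abstract's criterion. All counts are hypothetical.
n_drug_event = 40         # reports pairing the drug with 'overall liver injury'
n_drug       = 1_000      # all reports mentioning the drug
n_event      = 20_000     # all 'overall liver injury' reports in the database
n_total      = 2_000_000  # all reports in the database

expected = n_drug * n_event / n_total  # reports expected under independence
print(f"O/E = {n_drug_event / expected:.1f}")  # 4.0 with these counts
```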
Abstract:
Chronic renal failure is commonly related to hyponutrition, which affects approximately one third of patients with advanced renal failure. We carried out a longitudinal study to assess the nutritional evolution of 73 patients on a regular hemodialysis program, assessing changes in the anthropometric parameter body mass index (BMI) and its correspondence to biochemical nutritional parameters such as total protein (TP) levels and serum albumin (Alb). Every three months, plasma TP and albumin levels were collected and BMI was calculated by the standard formula: post-dialysis weight in kg/height in m2. For classification by BMI categories, overweight and low weight were defined according to the WHO Expert Committee. The patients studied had a mean age of 53 years; 43 were male and 30 were female. BMI in women was lower than that in men (p < 0.001), as were TP (p < 0.001) and Alb (p < 0.001) levels. Mean BMI was 29.3 kg/m2. Of the determinations, 3.2% showed low weight, 12.16% overweight, and 83.97% normal BMI. TP was normal in 90.76% and decreased in 9.24%. Alb was normal in 82.2% and low in 17.78%. After the follow-up period (21.6 months; minimum 18 months, maximum 53 months), the Kruskal-Wallis test did not show a statistically significant change for BMI, but it did show a change for the biochemical parameters albumin and total proteins (p < 0.05): nutritional impairment in CRF patients is manifested in biochemical parameters (TP and Alb) with no reflection in anthropometric data.
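The BMI formula and categories used above are easy to make concrete. A Python sketch assuming the standard WHO cut-offs (<18.5 low weight, ≥25 overweight); the patient values are hypothetical:

```python
# BMI = post-dialysis weight (kg) / height (m)^2, classified per WHO cut-offs.
def bmi(weight_kg: float, height_m: float) -> float:
    return weight_kg / height_m ** 2

def classify(b: float) -> str:
    if b < 18.5:
        return "low weight"
    return "overweight" if b >= 25.0 else "normal"

b = bmi(62.0, 1.68)  # hypothetical post-dialysis weight and height
print(f"{b:.1f} kg/m2 -> {classify(b)}")  # 22.0 kg/m2 -> normal
```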
Abstract:
A child with clinical features associated with trisomy of the distal part of 9q was shown to have the following abnormal chromosome complement: 47,XY,+t(X;9)(Xpter→Xq24::9q31→9qter), inv(9)(p11q13), var 14 (14pQFQ34).
Abstract:
INTRODUCTION According to several series, hospital hyponutrition affects 30-50% of hospitalized patients. This high prevalence justifies the need for early detection from admission. Several classical screening tools exist, but they show important limitations in their systematic application in daily clinical practice. OBJECTIVES To analyze the relationship between hyponutrition, detected by our screening method, and mortality, hospital stay, or re-admissions; to analyze, as well, the relationship between hyponutrition and the prescription of nutritional support; to compare different nutritional screening methods at admission on a random sample of hospitalized patients; and to validate the INFORNUT method for nutritional screening. MATERIAL AND METHODS In a phase prior to the study design, a retrospective analysis of data from 2003 was carried out in order to establish the situation of hyponutrition in the Virgen de la Victoria Hospital, Malaga, gathering data from the MBDS (Minimal Basic Data Set), laboratory analyses of nutritional risk (FILNUT filter), and prescriptions of nutritional support. In the experimental phase, a cross-sectional cohort study was done with a random sample of 255 patients in May 2004. Anthropometrical assessment, Subjective Global Assessment (SGA), Mini-Nutritional Assessment (MNA), Nutritional Risk Screening (NRS), Gassull's method, CONUT, and INFORNUT were performed. The settings of the INFORNUT filter were: albumin < 3.5 g/dL, and/or total proteins < 5 g/dL, and/or prealbumin < 18 mg/dL, with or without total lymphocyte count < 1,600 cells/mm3 and/or total cholesterol < 180 mg/dL. In order to compare the different methods, a gold standard was created based on the recommendations of the SENPE on anthropometrical and laboratory data. Statistical association was analyzed by the chi-squared test (α = 0.05) and agreement by the kappa (k) index. RESULTS In the preliminary-phase study, the prevalence of hospital hyponutrition was 53.9%. One thousand six hundred and forty-four patients received nutritional support, of whom 66.9% suffered from hyponutrition. We also observed that hyponutrition is one of the factors favoring increases in mortality (hyponourished patients 15.19% vs. non-hyponourished 2.58%), hospital stay (20.95 vs. 8.75 days), and re-admissions (14.30% vs. 6%). The results of the experimental study were as follows: the prevalence of hyponutrition obtained with the gold standard was 61%, and with INFORNUT 60%. Agreement among INFORNUT, CONUT, and GASSULL was good or very good (k: 0.67 INFORNUT with CONUT; k: 0.94 INFORNUT with GASSULL), as was their agreement with the gold standard (k: 0.83 INFORNUT; k: 0.64 CONUT; k: 0.89 GASSULL). However, the structured tests (SGA, MNA, NRS) showed low agreement indexes with the gold standard and with the laboratory or mixed tests (Gassull), and only a low to intermediate level of agreement when compared with each other (k: 0.489 NRS with SGA). INFORNUT showed a sensitivity of 92.3%, a positive predictive value of 94.1%, and a specificity of 91.2%. After the filter phase, a preliminary report is issued, to which anthropometrical and intake data are added, and a Nutritional Risk Report is produced. CONCLUSIONS The prevalence of hyponutrition in our study (60%) is similar to that found by other authors. Hyponutrition is associated with increased mortality, hospital stay, and re-admission rates. No existing tool has proven effective for the early detection of hyponutrition in the hospital setting without important limitations to its applicability. FILNUT, as the first phase of the INFORNUT filter process, represents a valid tool: it has the sensitivity and specificity required for nutritional screening at admission. The main advantages of the process are the early detection of patients at risk of hyponutrition; a teaching and awareness function for health care staff, involving them in the nutritional assessment of their patients; and the inclusion of the diagnosis of hyponutrition and the need for nutritional support in the discharge report, to be registered by the Clinical Documentation Department. INFORNUT would thus be a universal screening method with a good cost-effectiveness ratio.
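The INFORNUT performance figures quoted (sensitivity 92.3%, PPV 94.1%, specificity 91.2%) follow from a standard 2x2 comparison against the gold standard. A Python sketch is shown below; the confusion-matrix counts are hypothetical, chosen to be consistent with the reported percentages, since the abstract does not give the table:

```python
# Screening-test metrics vs. the gold standard, from a 2x2 table.
# tp/fp/fn/tn are hypothetical counts consistent with the reported figures.
tp, fp, fn, tn = 144, 9, 12, 93

sensitivity = tp / (tp + fn)  # 144/156 = 92.3%
specificity = tn / (tn + fp)  # 93/102  = 91.2%
ppv         = tp / (tp + fp)  # 144/153 = 94.1%
print(f"sens {sensitivity:.1%}, spec {specificity:.1%}, PPV {ppv:.1%}")
```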
Abstract:
BACKGROUND AND AIMS: Ficolin-2 is an acute-phase reactant that is produced by the liver and recognizes N-acetyl-glucosamine, which is present in bacterial and fungal cell walls. We recently showed that ficolin-2 serum levels were significantly higher in Crohn's disease (CD) patients compared to healthy controls. We aimed to evaluate serum ficolin-2 concentrations in CD patients with regard to their correlation with endoscopic severity, and to compare them with clinical activity, fecal calprotectin, and CRP. METHODS: Patients provided fecal and blood samples before undergoing ileo-colonoscopy. Disease activity was scored clinically according to the Harvey-Bradshaw Index (HBI) and endoscopically according to the simplified endoscopic score for CD (SES-CD). Ficolin-2 serum levels and fecal calprotectin levels were measured by ELISA. RESULTS: A total of 136 CD patients were prospectively included (mean age at inclusion 41.5±15.4 years, 37.5% females). Median HBI was 3 [2-6] points, median SES-CD was 5 [2-8], median fecal calprotectin was 301 [120-703] μg/g, and median serum ficolin-2 was 2.69 [2.02-3.83] μg/mL. SES-CD correlated significantly with calprotectin (R=0.676, P<0.001), CRP (R=0.458, P<0.001), HBI (R=0.385, P<0.001), and serum ficolin-2 levels (R=0.171, P=0.047). Ficolin-2 levels were higher in CD patients with mild endoscopic disease than in patients in endoscopic remission (P=0.015), but no difference was found between patients with mild, moderate, and severe endoscopic disease. CONCLUSIONS: Ficolin-2 serum levels correlate less well with endoscopic CD activity than do fecal calprotectin or CRP.
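The R values above are correlation coefficients between SES-CD and each marker. The abstract does not name the method; rank correlation (Spearman) is common for endoscopic scores and is assumed in this sketch, with hypothetical data points:

```python
# Spearman rank correlation between SES-CD and fecal calprotectin.
# The abstract does not specify the correlation method; Spearman is assumed.
# All data points below are hypothetical.
from scipy.stats import spearmanr

ses_cd       = [2, 5, 8, 3, 12, 6, 4, 9, 1, 7]
calprotectin = [150, 290, 640, 220, 1050, 380, 410, 700, 95, 360]
rho, p = spearmanr(ses_cd, calprotectin)
print(f"R = {rho:.3f}, P = {p:.4f}")
```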
Abstract:
BACKGROUND: The ideal local anesthetic regime for femoral nerve block that balances analgesia with mobility after total knee arthroplasty (TKA) remains undefined. QUESTIONS/PURPOSES: We compared two volumes and concentrations of a fixed dose of ropivacaine for continuous femoral nerve block after TKA with a single-injection femoral nerve block with ropivacaine to determine (1) time to discharge readiness; (2) early pain scores and analgesic consumption; and (3) functional outcomes, including range of motion and WOMAC scores at the time of recovery. METHODS: Ninety-nine patients were allocated to one of three continuous femoral nerve block groups for this randomized, placebo-controlled, double-blind trial: a high concentration group (ropivacaine 0.2% infusion), a low concentration group (ropivacaine 0.1% infusion), or a placebo infusion group (saline 0.9% infusion). Infusions were discontinued on postoperative Day (POD) 2. The primary outcome was time to discharge readiness. Secondary outcomes included opioid consumption, pain, and functional outcomes. Ninety-three patients completed the study protocol; the study was halted early because of unanticipated changes to pain protocols at the host institution, by which time only 61% of the required number of patients had been enrolled. RESULTS: With the numbers available, the mean time to discharge readiness was not different between groups (high concentration group, 62 hours [95% confidence interval (CI), 51-72 hours]; low concentration group, 73 hours [95% CI, 63-83 hours]; placebo infusion group, 65 hours [95% CI, 56-75 hours]; p = 0.27). Patients in the low concentration group consumed significantly less morphine during the period of infusion (POD 1: high concentration group, 56 mg [95% CI, 42-70 mg]; low concentration group, 35 mg [95% CI, 27-43 mg]; placebo infusion group, 48 mg [95% CI, 38-59 mg], p = 0.02; POD 2: high concentration group, 50 mg [95% CI, 41-60 mg]; low concentration group, 33 mg [95% CI, 24-42 mg]; placebo infusion group, 39 mg [95% CI, 30-48 mg], p = 0.04); however, there were no important differences in pain scores or opioid-related side effects with the numbers available. Likewise, there were no important differences in functional outcomes between groups. CONCLUSIONS: Based on this study, which was terminated prematurely before the desired sample size could be achieved, we were unable to demonstrate that varying the concentration and volume of a fixed-dose ropivacaine infusion for continuous femoral nerve block influences time to discharge readiness when compared with a conventional single-injection femoral nerve block after TKA. A low concentration ropivacaine infusion can reduce postoperative opioid consumption, but without any important differences in pain scores, side effects, or functional outcomes. These pilot data may be used to inform the statistical power of future randomized trials. LEVEL OF EVIDENCE: Level II, therapeutic study. See Guidelines for Authors for a complete description of levels of evidence.
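The design compares a fixed ropivacaine dose delivered at different concentration/volume combinations; the dose arithmetic is worth making explicit. A Python sketch follows (the infusion rates are hypothetical, as the abstract does not state them):

```python
# Dose rate (mg/h) = concentration (%, i.e. g/100 mL = 10 mg/mL per 1%)
# times infusion rate (mL/h). Hypothetical rates illustrating how 0.2% and
# 0.1% infusions can deliver the same fixed dose.
def mg_per_hour(concentration_pct: float, rate_ml_per_h: float) -> float:
    return concentration_pct * 10 * rate_ml_per_h

print(mg_per_hour(0.2, 5.0))   # 10.0 mg/h: high concentration, low volume
print(mg_per_hour(0.1, 10.0))  # 10.0 mg/h: low concentration, high volume
```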
Abstract:
A cross-sectional study was conducted in which serum anti-phenolic glycolipid-1 (anti-PGL-1) antibodies in household contacts (HHC) of patients with leprosy were analysed as an adjunct marker for early leprosy diagnosis. The families of 83 patients underwent clinical examination and serum anti-PGL-1 measurement using an enzyme-linked immunosorbent assay. Of 320 HHC, 98 were contacts of lepromatous leprosy (LL), 80 of borderline lepromatous (BL), 28 of borderline (BB), 54 of borderline tuberculoid (BT), 40 of tuberculoid (TT), and 20 of indeterminate (I) leprosy patients. Consanguinity with the patients was determined for 232 (72.5%) HHC. Of those 232 contacts, 183 had linear consanguinity and 49 had collateral consanguinity. Fifty-eight contacts (18.1%) tested positive for anti-PGL-1 antibodies. By clinical form of the index case, the seropositive contacts numbered 17 (29.3%) for LL, 15 (25.9%) for BL, one (1.7%) for BB, 14 (24.1%) for BT, three (5.2%) for TT, and eight (13.7%) for I. At the one-year follow-up, two (3.4%) of these seropositive contacts had developed BT leprosy. The results of the present study indicate that serum anti-PGL-1 IgM antibody testing may be useful for evaluating antigen exposure and as a tool for early leprosy diagnosis in HHC.