54 results for Case Control Studies
Abstract:
Early studies in patients with systemic lupus erythematosus (SLE) reported an increased incidence of tuberculosis. The tuberculin skin test (TST) is the technique of choice to detect latent tuberculosis infection (LTBI) but has several limitations. OBJECTIVES We compared the TST and the newer T.SPOT.TB test for diagnosing LTBI in SLE patients. METHODS In this observational cohort study, conducted between August 2009 and February 2012, we recruited 92 patients attending the SLE clinic of our university hospital and recorded their epidemiological and sociodemographic characteristics. Laboratory analyses included the TST and T.SPOT.TB tests. RESULTS Of the patients studied, 92% were women, with an average age of 42.7 years. Overall, the degree of agreement between the two tests was low (Kappa index = 0.324) but was better in patients not receiving corticosteroid (CTC)/immunosuppressive (IS) therapy (Kappa = 0.436) and in those receiving hydroxychloroquine (Kappa = 0.473). Whereas TST results were adversely affected in patients receiving CTC and/or IS drugs (P = 0.021), T.SPOT.TB results were not. CONCLUSION Although the TST remains a useful tool for diagnosing LTBI in SLE patients, the T.SPOT.TB test is perhaps better employed when the patient is receiving CTC and/or IS drugs.
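The agreement statistic reported above is Cohen's kappa. As a hedged illustration of how such an index is computed from a 2x2 cross-classification of two binary tests, here is a minimal Python sketch; the counts are invented for illustration and do not reproduce the study's data.

```python
# Hypothetical 2x2 agreement table for two binary LTBI tests
# (counts are illustrative, not the study's data).
#            T.SPOT.TB+  T.SPOT.TB-
# TST+            a           b
# TST-            c           d

def cohens_kappa(a: int, b: int, c: int, d: int) -> float:
    """Cohen's kappa for two binary raters from a 2x2 table."""
    n = a + b + c + d
    p_o = (a + d) / n                       # observed agreement
    p_yes = ((a + b) / n) * ((a + c) / n)   # chance agreement on "positive"
    p_no = ((c + d) / n) * ((b + d) / n)    # chance agreement on "negative"
    p_e = p_yes + p_no                      # total chance agreement
    return (p_o - p_e) / (1 - p_e)

print(round(cohens_kappa(a=10, b=8, c=6, d=68), 3))
```

Kappa corrects raw percent agreement for the agreement expected by chance from the marginal totals, which is why two tests can agree often yet still yield a low kappa.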
Abstract:
There are very few disease-specific studies focusing on the outcomes of umbilical cord blood transplantation for Philadelphia chromosome-positive acute lymphoblastic leukemia. We report the outcomes of 45 patients with Philadelphia chromosome-positive acute lymphoblastic leukemia who underwent myeloablative single-unit cord blood transplantation from unrelated donors within the GETH/GITMO cooperative group. Conditioning regimens were based on combinations of thiotepa, busulfan, cyclophosphamide or fludarabine, and antithymocyte globulin. At the time of transplantation, 35 patients (78%) were in first complete remission, four (8%) in second complete remission, and six (14%) in third or subsequent response. The cumulative incidence of myeloid engraftment was 96% at a median time of 20 days and was significantly better for patients receiving higher doses of CD34(+) cells. The incidence of acute grade II-IV graft-versus-host disease was 31%, while that of overall chronic graft-versus-host disease was 53%. Treatment-related mortality was 17% at day +100 and 31% at 5 years. The 5-year relapse, event-free survival and overall survival rates were 31%, 36% and 44%, respectively. Although the event-free and overall survival rates in patients without detectable BCR/ABL transcripts at the time of transplantation were better than those in patients in whom BCR/ABL transcripts were detected (46% versus 24% and 60% versus 30%, respectively), these differences were not statistically significant in the univariate analysis (P=0.07). These results demonstrate that umbilical cord blood transplantation from unrelated donors can be a curative treatment for a substantial number of patients with Philadelphia chromosome-positive acute lymphoblastic leukemia.
Abstract:
BACKGROUND & AIMS Hy's Law, which states that hepatocellular drug-induced liver injury (DILI) with jaundice indicates a serious reaction, is used widely to determine risk for acute liver failure (ALF). We aimed to optimize the definition of Hy's Law and to develop a model for predicting ALF in patients with DILI. METHODS We collected data from 771 patients with DILI (805 episodes) from the Spanish DILI registry, from April 1994 through August 2012. We analyzed data collected at DILI recognition and at the time of peak levels of alanine aminotransferase (ALT) and total bilirubin (TBL). RESULTS Of the 771 patients with DILI, 32 developed ALF. Hepatocellular injury, female sex, high levels of TBL, and a high aspartate aminotransferase (AST):ALT ratio were independent risk factors for ALF. We compared 3 ways to use Hy's Law to predict which patients would develop ALF; all included TBL greater than 2-fold the upper limit of normal (×ULN) together with either an ALT level greater than 3 ×ULN, a ratio (R) value (ALT expressed as ×ULN divided by alkaline phosphatase expressed as ×ULN) of 5 or greater, or a new ratio (nR) value (ALT or AST, whichever produced the higher ×ULN value, divided by alkaline phosphatase expressed as ×ULN) of 5 or greater. At recognition of DILI, the R- and nR-based models identified patients who developed ALF with 67% and 63% specificity, respectively, whereas use of the ALT level alone identified them with 44% specificity. However, the ALT level and the nR model each identified patients who developed ALF with 90% sensitivity, whereas the R criteria identified them with 83% sensitivity. Equal numbers of patients who did and did not develop ALF had alkaline phosphatase levels greater than 2 ×ULN. An algorithm based on an AST level greater than 17.3 ×ULN, TBL greater than 6.6 ×ULN, and AST:ALT greater than 1.5 identified patients who developed ALF with 82% specificity and 80% sensitivity. CONCLUSIONS When applied at DILI recognition, the nR criteria for Hy's Law provide the best balance of sensitivity and specificity, whereas our new composite algorithm provides additional specificity in predicting the ultimate development of ALF.
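The laboratory criteria above lend themselves to a compact decision rule. Below is a minimal, hedged Python sketch of the nR-based Hy's Law check and the composite algorithm; the thresholds come from the abstract, the function and parameter names are ours, and we assume the AST:ALT cut-off applies to the raw serum values rather than the ×ULN values.

```python
# Hedged sketch of the abstract's criteria. Parameters ending in
# "_xuln" are analyte values expressed as multiples of their upper
# limit of normal; thresholds are from the abstract, names are ours.

def n_ratio(alt_xuln: float, ast_xuln: float, alp_xuln: float) -> float:
    """nR: ALT or AST (whichever is higher in xULN) over ALP (in xULN)."""
    return max(alt_xuln, ast_xuln) / alp_xuln

def nr_hys_law(alt_xuln, ast_xuln, alp_xuln, tbl_xuln) -> bool:
    """nR-based Hy's Law: TBL > 2 xULN and nR >= 5."""
    return tbl_xuln > 2 and n_ratio(alt_xuln, ast_xuln, alp_xuln) >= 5

def composite_alf_flag(ast, alt, ast_xuln, tbl_xuln) -> bool:
    """Composite rule: AST > 17.3 xULN, TBL > 6.6 xULN, AST:ALT > 1.5.

    We assume AST:ALT uses raw serum values (an assumption, not
    stated explicitly in the abstract).
    """
    return ast_xuln > 17.3 and tbl_xuln > 6.6 and ast / alt > 1.5

# Example: a hypothetical hepatocellular case meeting the nR criteria.
print(nr_hys_law(alt_xuln=12.0, ast_xuln=20.0, alp_xuln=1.5, tbl_xuln=3.0))  # True
```

Expressing every analyte in ×ULN is what lets a single cut-off apply across laboratories with different reference ranges.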
Abstract:
Previously published scientific papers have reported a negative correlation between drinking-water hardness and cardiovascular mortality. Some ecologic and case-control studies suggest a protective effect of calcium and magnesium concentrations in drinking water. In this article we present an analysis of this protective relationship in 538 municipalities of the Comunidad Valenciana (Spain) from 1991 to 1998. We used the Spanish version of the Rapid Inquiry Facility (RIF) developed under the European Environment and Health Information System (EUROHEIS) research project. Our strategy of analysis conforms to the exploratory nature of the RIF, which is used as a tool to obtain quick and flexible insight into epidemiologic surveillance problems. This article describes the use of the RIF to explore possible associations between disease indicators and environmental factors. We used exposure analysis to assess the effect of both candidate protective factors, calcium and magnesium, on mortality from cerebrovascular (ICD-9 430-438) and ischemic heart (ICD-9 410-414) diseases. This study provides statistical evidence of a relationship between mortality from cardiovascular diseases and the hardness of drinking water. The relationship is stronger in cerebrovascular disease than in ischemic heart disease, is more pronounced for women than for men, and is more apparent for magnesium than for calcium concentration levels. Nevertheless, the protective nature of these two factors is not clearly established. Our results suggest the possibility of protectiveness but cannot be claimed as conclusive. The weak effects of these covariates make it difficult to separate them from the influence of socioeconomic and environmental factors. We have also performed disease mapping of standardized mortality ratios to detect clusters of municipalities at high risk. Further standardization by levels of calcium and magnesium in drinking water shows changes in the maps when the effect of these covariates is removed.
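The disease-mapping step above rests on the standardized mortality ratio (SMR), a simple observed-to-expected quotient under indirect standardization. The sketch below shows the computation in Python; the age bands, person-years, and reference rates are invented for illustration and are not the study's data.

```python
# Minimal sketch of an indirectly standardized mortality ratio (SMR):
# observed deaths divided by the deaths expected if the reference
# population's age-specific rates applied to the local population.

def smr(observed_deaths, person_years_by_age, reference_rates_by_age):
    """SMR = observed / expected; SMR > 1 suggests excess mortality."""
    expected = sum(py * rate
                   for py, rate in zip(person_years_by_age,
                                       reference_rates_by_age))
    return observed_deaths / expected

# Example: three age bands in one hypothetical municipality.
print(round(smr(42, [12000, 8000, 3000], [0.0005, 0.002, 0.008]), 2))  # 0.91
```

Mapping the SMR per municipality, as the study does, highlights spatial clusters whose excess mortality persists (or disappears) once a covariate such as water magnesium is standardized away.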
Abstract:
BACKGROUND In 1997, 18.5% of the cases of meningococcal disease caused by serogroup C in Andalusia occurred in children between 2 and 4 years of age, ages at which the initial immune response to the capsular A + C meningococcal polysaccharide vaccine, and the duration of that response, are lower than in older age groups. This study was designed to measure the immune response produced by this vaccine in children from 2 to 6 years of age and to compare it with the natural immunity present in unvaccinated children. METHODS I. Dual monitoring study: a) groups of previously vaccinated children and control groups; b) groups of children who were about to be vaccinated, for pre- and post-vaccination (1, 6 and 12 months) analysis, and a control group. II. Bactericidal activity was measured according to the standardised CDC protocol against the N. meningitidis C-11 strain. Sera with a bactericidal activity titre (TAB) > 1:8 were considered protective. RESULTS At 1 and 2 months following vaccination, the proportion of TAB > 1:8 was significantly higher in vaccinated children than in the control group (65.6% and 73% compared with 2.2% and 12%). At the pre-vaccination and later post-vaccination (6, 7, 12 and 13 months) assessments, no significant difference between vaccinated individuals and controls was observed. CONCLUSIONS The differences between vaccinated and unvaccinated individuals 1 and 2 months after vaccination indicate seroconversion in the vaccinated individuals. In the 2-to-6-year age group, the acquired bactericidal activity declines quickly, as differences between this group and the control group are no longer observed after 6 months.
Abstract:
BACKGROUND Based on their mechanisms of action, combining somatostatin analogues (SSAs) with mTOR inhibitors or antiangiogenic agents may provide synergistic effects for the treatment of patients with neuroendocrine tumours (NETs). Herein, we investigate the use of these treatment combinations in clinical practice. METHODS This retrospective cross-sectional analysis of patients with NETs treated with the SSA lanreotide and targeted therapies at 35 Spanish hospitals evaluated the efficacy and safety of lanreotide treatment combinations in clinical practice. Data on 159 lanreotide treatment combinations in 133 patients were retrospectively collected. RESULTS Of the 133 patients, with a median age of 59.4 (range 16-83) years, 70 (52.6%) were male, 64 (48.1%) had pancreatic NET, 23 (17.3%) had ECOG PS ≥2, 41 (30.8%) had functioning tumours, 63 (47.7%) had undergone surgery of the primary tumour, 45 (33.8%) had received prior chemotherapy, and 115 (86.5%) had received prior SSAs. 115 patients received 1 lanreotide treatment combination and 18 patients received between 2 and 5 combinations. Lanreotide was mainly administered in combination with everolimus (73 combinations) or sunitinib (61 combinations). The probability of being progression-free was 78.5% (6 months), 68.6% (12 months) and 57.0% (18 months) for patients who received only everolimus plus lanreotide (n = 57), and 89.3% (6 months), 73.0% (12 months) and 67.4% (18 months) for patients who received only sunitinib plus lanreotide (n = 50). In patients who received only everolimus plus lanreotide, the median time to progression from the initiation of lanreotide combination treatment was 25.8 months (95% CI, 11.3-40.3); it had not yet been reached in the subgroup of patients receiving only sunitinib plus lanreotide. The safety profile of the combination treatment was comparable to that of the targeted agent alone. CONCLUSIONS The combination of lanreotide and targeted therapies, mainly everolimus and sunitinib, is widely used in clinical practice without unexpected toxicities, and the suggested efficacy should be explored in randomized prospective clinical trials.
Abstract:
AIM To assess the role of double-balloon enteroscopy (DBE) in malignant small bowel tumors (MSBT). METHODS This is a retrospective descriptive study performed in a single center. All consecutive patients who underwent DBE with a final diagnosis of a malignant neoplasm from 2004 to 2014 in our referral center were included. Patient demographic and clinicopathological characteristics were recorded and reviewed. MSBT diagnosis was achieved by DBE-directed biopsy with multiple tissue sampling, by unequivocal endoscopic findings, or by histological analysis of the surgical specimen. We analyzed the impact of DBE on the outcome and clinical course of these patients. RESULTS Of 627 patients, 28 (4.5%) (mean age 60 ± 17.3 years) underwent 30 procedures (25 anterograde, 5 retrograde) and were diagnosed with a malignant tumor. Patients presented with obscure gastrointestinal bleeding (n = 19, 67.9%), occlusion syndrome (n = 7, 25%) and diarrhea (n = 1, 3.6%). They were diagnosed by DBE biopsy (n = 18, 64.3%), histological analysis of the surgical specimen (n = 7, 25%) and unequivocal endoscopic findings (n = 2, 7.1%). Gastrointestinal stromal tumor (n = 8, 28.6%), adenocarcinoma (n = 7, 25%), lymphoma (n = 4, 14.3%), neuroendocrine tumor (n = 4, 14.3%), metastatic tumors (n = 3, 10.7%) and Kaposi sarcoma (n = 1, 3.6%) were identified. DBE modified the outcome in 7 cases (25%), delaying or avoiding emergency surgery (n = 3), modifying the surgical approach (n = 2), and indicating emergency partial small bowel resection instead of an elective approach (n = 2). CONCLUSION DBE may be critical in the management of MSBT, providing additional information that may be decisive in the clinical course of these patients.
Abstract:
Background: To describe overall and disease-free survival at five and ten years after breast cancer diagnosis in women from a previous case-control study, and to establish related prognostic factors. Methods: We followed up 202 patients diagnosed between 1996 and 1998 in three public hospitals in the Granada and Almeria provinces of Spain. Survival rates were calculated using the Kaplan-Meier method, and the Cox proportional hazards model was applied to identify the variables contributing most significantly to survival. Results: Mean age at diagnosis was 54.27 ± 10.4 years. Mean follow-up for overall survival was 119.91 months (95% CI 113.65-126.17); the five-year survival rate was 83.9% (95% CI 78.13-89.66) and the ten-year rate was 71% (95% CI 63.25-78.74). Mean follow-up for disease-free survival was 118.75 months (95% CI 111.86-125.65); the five-year disease-free survival rate was 81% (95% CI 74.52-87.47) and the ten-year rate was 71.3% (95% CI 63.33-79.26). The mortality rate in the study population was 33.17%. Conclusions: Disease characteristics in our population are similar to those in other Spanish and European regions, while overall survival is higher than the mean rate during the same period in Europe (5-year rate of 79%) and similar to that in Spain (83%).
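As a hedged illustration of the two methods named in the Methods section, the sketch below fits a Kaplan-Meier curve and a Cox proportional hazards model with the lifelines Python library; the follow-up times, event indicators, and the age covariate are invented, not the study's records.

```python
import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

# Invented follow-up data: months of observation, event indicator
# (1 = event observed, 0 = censored) and one example covariate.
df = pd.DataFrame({
    "months": [24, 60, 71, 95, 110, 119, 130, 140],
    "event":  [1, 1, 0, 1, 0, 1, 0, 0],
    "age":    [61, 54, 47, 66, 50, 58, 49, 44],
})

# Kaplan-Meier: non-parametric survival curve that handles censoring.
kmf = KaplanMeierFitter()
kmf.fit(df["months"], event_observed=df["event"])
print(kmf.survival_function_at_times([60, 120]))  # 5- and 10-year estimates

# Cox model: semi-parametric regression of the hazard on covariates.
cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="event")
print(cph.summary[["coef", "exp(coef)", "p"]])  # hazard ratio for age
```

The Kaplan-Meier estimator produces the five- and ten-year survival rates reported above, while the Cox model is what identifies which prognostic variables independently shift the hazard.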
Abstract:
BACKGROUND: Extended-spectrum beta-lactamase (ESBL)-producing members of the Enterobacteriaceae family are important nosocomial pathogens. Escherichia coli producing a specific family of ESBLs (the CTX-M enzymes) are emerging worldwide. The epidemiology of these organisms as causes of nosocomial infection is poorly understood. The aims of this study were to investigate the clinical and molecular epidemiology of nosocomial infection or colonization due to ESBL-producing E. coli in hospitalized patients, to consider the specific types of ESBLs produced, and to identify the risk factors for infection and colonization with these organisms. METHODS: All patients with nosocomial colonization and/or infection due to ESBL-producing E. coli identified between January 2001 and May 2002 in 2 centers (a tertiary care hospital and a geriatric care center) were included. A double case-control study was performed. The clonal relatedness of the isolates was studied by repetitive extragenic palindromic polymerase chain reaction and pulsed-field gel electrophoresis. ESBLs were characterized by isoelectric focusing, polymerase chain reaction, and sequencing. RESULTS: Forty-seven case patients were included. CTX-M-producing E. coli isolates were clonally unrelated and more frequently susceptible to non-oxyimino-beta-lactams. Conversely, isolates producing SHV- and TEM-type ESBLs were epidemic and multidrug resistant. Urinary catheterization was a risk factor for both CTX-M-producing and SHV-TEM-producing isolates. Previous oxyimino-beta-lactam use, diabetes, and ultimately fatal or nonfatal underlying diseases were independent risk factors for infection or colonization with CTX-M-producing isolates, whereas previous fluoroquinolone use was associated with infection or colonization with SHV-TEM-producing isolates. CONCLUSIONS: The epidemiology of ESBL-producing E. coli as a cause of nosocomial infection is complex. Sporadic CTX-M-producing isolates coexisted with epidemic multidrug-resistant SHV-TEM-producing isolates. These data should be taken into account in the design of control measures.