883 results for Disease, control
Abstract:
BACKGROUND: Controlled studies established the efficacy and good tolerability of pimecrolimus cream 1% for the treatment of atopic dermatitis but they may not reflect real-life use. OBJECTIVE: To evaluate the efficacy, tolerability and cosmetic acceptance of a pimecrolimus-based regimen in daily practice in Switzerland. METHODS: This was a 6-month, open-label, multicentre study in 109 patients (55% ≥ 18 years) with atopic dermatitis. Pimecrolimus cream 1% was incorporated into patients' standard treatment protocols. RESULTS: The pimecrolimus-based treatment was well tolerated and produced disease improvement in 65.7% of patients. It was particularly effective on the face (improvement rate: 75.0%). Mean pimecrolimus consumption decreased from 6.4 g/day (months 1-3) to 4.0 g/day (months 3-6) as disease improved. Most patients (74.1%) rated their disease control as 'complete' or 'good' and 90% were highly satisfied with the cream formulation. CONCLUSION: The use of a pimecrolimus-based regimen in everyday practice was effective, well tolerated and well accepted by patients.
Abstract:
BACKGROUND: Human immunodeficiency virus (HIV)-infected persons may be at increased risk for developing type 2 diabetes mellitus because of viral coinfection and adverse effects of treatment. METHODS: We studied associations of new-onset diabetes mellitus with hepatitis B virus and hepatitis C virus coinfections and antiretroviral therapy in participants in the Swiss HIV Cohort Study, using Poisson regression. RESULTS: A total of 123 of 6513 persons experienced diabetes mellitus during 27,798 person-years of follow-up (PYFU), resulting in an incidence of 4.4 cases per 1000 PYFU (95% confidence interval [CI], 3.7-5.3 cases per 1000 PYFU). An increased incidence rate ratio (IRR) was found for male subjects (IRR, 2.5; 95% CI, 1.5-4.2), older age (IRR for subjects >60 years old, 4.3; 95% CI, 2.3-8.2), black (IRR, 2.1; 95% CI, 1.1-4.0) and Asian (IRR, 4.9; 95% CI, 2.2-10.9) ethnicity, Centers for Disease Control and Prevention disease stage C (IRR, 1.6; 95% CI, 1.04-2.4), and obesity (IRR, 4.7; 95% CI, 3.1-7.0), but results for hepatitis C virus infection or active hepatitis B virus infection were inconclusive. Strong associations were found for current treatment with nucleoside reverse-transcriptase inhibitors (IRR, 2.22; 95% CI, 1.11-4.45), nucleoside reverse-transcriptase inhibitors plus protease inhibitors (IRR, 2.48; 95% CI, 1.42-4.31), and nucleoside reverse-transcriptase inhibitors plus protease inhibitors and nonnucleoside reverse-transcriptase inhibitors (IRR, 3.25; 95% CI, 1.59-6.67) but were not found for treatment with nucleoside reverse-transcriptase inhibitors plus nonnucleoside reverse-transcriptase inhibitors (IRR, 1.47; 95% CI, 0.77-2.82). CONCLUSIONS: In addition to traditional risk factors, current treatment with protease inhibitor- and nucleoside reverse-transcriptase inhibitor-containing regimens was associated with the risk of developing type 2 diabetes mellitus. Our study did not find a significant association between viral hepatitis infection and risk of incident diabetes.
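As a quick worked check of the incidence figure above: 123 events over 27,798 person-years correspond to 4.4 cases per 1000 PYFU, and an exact Poisson (Garwood) interval reproduces the published 95% CI. The sketch below is illustrative only; the abstract does not state which interval method the authors used.

from scipy.stats import chi2

cases = 123      # new-onset diabetes mellitus events
pyfu = 27798.0   # person-years of follow-up

rate = cases / pyfu * 1000  # cases per 1000 person-years

# Exact (Garwood) 95% confidence limits for a Poisson count, scaled to 1000 PYFU
lower = chi2.ppf(0.025, 2 * cases) / 2 / pyfu * 1000
upper = chi2.ppf(0.975, 2 * (cases + 1)) / 2 / pyfu * 1000

print(f"{rate:.1f} per 1000 PYFU (95% CI {lower:.1f}-{upper:.1f})")
# prints: 4.4 per 1000 PYFU (95% CI 3.7-5.3), matching the abstract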
Abstract:
BACKGROUND: The CD4 T cell count recovery in human immunodeficiency virus type 1 (HIV-1)-infected individuals receiving potent antiretroviral therapy (ART) shows high variability. We studied the determinants and the clinical relevance of incomplete CD4 T cell restoration. METHODS: Longitudinal CD4 T cell count was analyzed in 293 participants of the Swiss HIV Cohort Study who had had a plasma HIV-1 RNA load <1000 copies/mL for ≥5 years. CD4 T cell recovery was stratified by CD4 T cell count 5 years after initiation of ART (≥500 cells/μL was defined as a complete response, and <500 cells/μL was defined as an incomplete response). Determinants of incomplete responses and clinical events were evaluated using logistic regression and survival analyses. RESULTS: The median CD4 T cell count increased from 180 cells/μL at baseline to 576 cells/μL 5 years after ART initiation. A total of 35.8% of patients were incomplete responders, of whom 47.6% reached a CD4 T cell plateau <500 cells/μL. Centers for Disease Control and Prevention HIV-1 disease category B and/or C events occurred in 21% of incomplete responders and in 14.4% of complete responders (P>.05). Older age (adjusted odds ratio [aOR], 1.71 per 10-year increase; 95% confidence interval [CI], 1.21-2.43), lower baseline CD4 T cell count (aOR, 0.37 per 100-cell increase; 95% CI, 0.28-0.49), and longer duration of HIV infection (aOR, 2.39 per 10-year increase; 95% CI, 1.19-4.81) were significantly associated with a CD4 T cell count <500 cells/μL at 5 years. The median increases in CD4 T cell count after 3-6 months of ART were smaller in incomplete responders (P<.001) and predicted, in conjunction with baseline CD4 T cell count and age, incomplete response with 80% sensitivity and 72% specificity. CONCLUSION: Individuals with incomplete CD4 T cell recovery to <500 cells/μL had more advanced HIV-1 infection at baseline. CD4 T cell changes during the first 3-6 months of ART already reflect the capacity of the immune system to replenish depleted CD4 T lymphocytes.
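The 80% sensitivity and 72% specificity quoted above describe how well the early (3-6 month) CD4 gain, baseline CD4 count, and age classify incomplete responders. A minimal sketch of that kind of classification is shown below on simulated data; the cohort data and the authors' actual model are not reproduced here, so the variable distributions and coefficients are purely illustrative.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import confusion_matrix

rng = np.random.default_rng(0)
n = 293
age = rng.normal(40, 10, n)                           # years
baseline_cd4 = rng.normal(180, 80, n).clip(10, 600)   # cells/uL at ART start
early_gain = rng.normal(100, 60, n)                   # CD4 increase after 3-6 months of ART

# Simulated outcome: older age, lower baseline CD4 and a smaller early gain
# make an incomplete response (<500 cells/uL at 5 years) more likely
logit = 1.5 - 0.01 * baseline_cd4 - 0.01 * early_gain + 0.03 * (age - 40)
incomplete = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = np.column_stack([age, baseline_cd4, early_gain])
model = LogisticRegression().fit(X, incomplete)

tn, fp, fn, tp = confusion_matrix(incomplete, model.predict(X)).ravel()
print(f"sensitivity={tp / (tp + fn):.2f}  specificity={tn / (tn + fp):.2f}")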
Abstract:
BACKGROUND: To determine the activity and tolerability of adding cetuximab to the oxaliplatin and capecitabine (XELOX) combination in first-line treatment of metastatic colorectal cancer (MCC). PATIENTS AND METHODS: In a multicenter two-arm phase II trial, patients were randomized to receive oxaliplatin 130 mg/m² on day 1 and capecitabine 1000 mg/m² twice daily on days 1-14 every 3 weeks, either alone or in combination with standard-dose cetuximab. Treatment was limited to a maximum of six cycles. RESULTS: Seventy-four patients with good performance status entered the trial. Objective partial response rates after external review and radiological confirmation were 14% and 41% in the XELOX and the XELOX + cetuximab arm, respectively. Stable disease was observed in 62% and 35% of the patients, giving 76% disease control in both arms. Cetuximab led to skin rash in 65% of the patients. The median overall survival was 16.5 months for arm A and 20.5 months for arm B. The median time to progression was 5.8 months for arm A and 7.2 months for arm B. CONCLUSION: Differences in response rates between the treatment arms indicate that cetuximab may improve outcome with XELOX. The precise role of cetuximab, oxaliplatin and fluoropyrimidine combinations in first-line treatment of MCC has to be assessed in phase III trials.
Abstract:
OBJECTIVE: To obtain precise information on the optimal time window for surgical antimicrobial prophylaxis. SUMMARY BACKGROUND DATA: Although perioperative antimicrobial prophylaxis is a well-established strategy for reducing the risk of surgical site infections (SSI), the optimal timing for this procedure has yet to be precisely determined. Under today's recommendations, antibiotics may be administered within the final 2 hours before skin incision, ideally as close to incision time as possible. METHODS: In this prospective observational cohort study at Basel University Hospital we analyzed the incidence of SSI by the timing of antimicrobial prophylaxis in a consecutive series of 3836 surgical procedures. Surgical wounds and resulting infections were assessed according to Centers for Disease Control and Prevention standards. Antimicrobial prophylaxis consisted of single-shot administration of 1.5 g of cefuroxime (plus 500 mg of metronidazole in colorectal surgery). RESULTS: The overall SSI rate was 4.7% (180 of 3836). In 49% of all procedures, antimicrobial prophylaxis was administered within the final half hour. Multivariable logistic regression analyses showed a significant increase in the odds of SSI when antimicrobial prophylaxis was administered less than 30 minutes before incision (crude odds ratio = 2.01; adjusted odds ratio = 1.95; 95% confidence interval, 1.4-2.8; P < 0.001) or 120 to 60 minutes before incision (crude odds ratio = 1.75; adjusted odds ratio = 1.74; 95% confidence interval, 1.0-2.9; P = 0.035), as compared with the reference interval of 59 to 30 minutes before incision. CONCLUSIONS: When cefuroxime is used as a prophylactic antibiotic, administration 59 to 30 minutes before incision is more effective than administration during the last half hour.
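For readers unfamiliar with the analysis above, the crude and adjusted odds ratios come from logistic regression with the 59-30 minute window as the reference category. A minimal sketch of that setup on simulated data follows; the timing labels, event rates, and records are hypothetical stand-ins, not the Basel cohort data.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 3836
timing = rng.choice(["30-59 min", "0-29 min", "60-120 min"], size=n, p=[0.35, 0.49, 0.16])
risk = {"30-59 min": 0.03, "0-29 min": 0.06, "60-120 min": 0.05}   # assumed SSI risks
ssi = rng.binomial(1, [risk[t] for t in timing])

df = pd.DataFrame({
    "ssi": ssi,
    "timing": pd.Categorical(timing, categories=["30-59 min", "0-29 min", "60-120 min"]),
})

# Logistic regression; the first category (30-59 min) serves as the reference window
fit = smf.logit("ssi ~ C(timing)", data=df).fit(disp=False)
print(np.exp(fit.params))      # odds ratios vs the 30-59 minute reference
print(np.exp(fit.conf_int()))  # 95% confidence intervals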
Abstract:
OBJECTIVE: To investigate whether prolonged sacral neuromodulation (SNM) testing induces a substantial risk of infection because of the percutaneous passage of the extension wire. PATIENTS AND METHODS: A consecutive series of 20 patients with negative prolonged SNM testing for ≥14 days who underwent tined-lead explantation were prospectively evaluated. The explanted tined leads were sent for microbiological examination. The tined lead, gluteal, and extension wire incision sites were investigated for clinical signs of infection according to the Centers for Disease Control and Prevention classification system. RESULTS: In all, 17 patients had bilateral and three had unilateral implanted tined leads. The median (range) test period was 30 (21-62) days. Bacterial growth (Staphylococcus species) was detected in four of 20 (20%) patients on seven of 37 (19%) explanted tined leads. There were clinical signs of infection in one of 20 (5%) patients, at none of 37 tined lead, one of 20 (5%) gluteal, and none of 20 extension wire incision sites. There were no clinical signs of infection in the remaining three of four patients with bacterial growth. CONCLUSIONS: After prolonged tined-lead testing, we found an infection rate comparable to that reported with the usual short test period. In addition, most patients with bacterial growth on tined leads showed no clinical signs of infection. Thus, prolonged tined-lead testing does not seem to induce clinically relevant infection, warranting randomized trials.
Abstract:
Most criticism of homeopathy concerns the lack of a scientific basis and theoretical models. In order to be accepted as a valid part of medical practice, a well-structured research strategy for homeopathy is needed. This is often hampered by methodological problems as well as by gross underinvestment in the required academic resources. Fundamental research could make important contributions to our understanding of the mechanisms of action of homeopathy and high dilutions. Since the pioneering works of Kolisko on wheat germination (Kolisko, 1923) and Junker on the growth of microorganisms (paramecium, yeast, fungi) (Junker, 1928), a number of experiments have been performed either with healthy organisms (various physiological aspects of growth) or with artificially diseased organisms, which may react more markedly to homeopathic treatments than healthy ones. In the latter case, the preliminary stress may be either abiotic, e.g. heavy metals, or biotic, e.g. fungal and viral pathogens or nematode infection. Research has also been carried out into the applicability of homeopathic principles to crop growth and disease control (agrohomeopathy): because of the extreme dilutions used, the environmental impact is low and such treatments are well suited to the holistic approach of sustainable agriculture (Betti et al., 2006). Unfortunately, as Scofield reported in an extensive critical review (Scofield, 1984), there is little firm evidence to support the reliability of the reported results, owing to poor experimental methodology and inadequate statistical analysis. Moreover, since there is no agricultural homeopathic pharmacopoeia, much work is required to find suitable remedies, potencies and dose levels.
Abstract:
BACKGROUND: In recent years, treatment options for human immunodeficiency virus type 1 (HIV-1) infection have changed from nonboosted protease inhibitors (PIs) to nonnucleoside reverse-transcriptase inhibitors (NNRTIs) and boosted PI-based antiretroviral drug regimens, but the impact on immunological recovery remains uncertain. METHODS: From January 1996 through December 2004, all patients in the Swiss HIV Cohort were included if they received their first combination antiretroviral therapy (cART) and had known baseline CD4+ T cell counts and HIV-1 RNA values (n = 3293). For follow-up, we used the Swiss HIV Cohort Study database update of May 2007. The mean (±SD) duration of follow-up was 26.8 ± 20.5 months. The follow-up time was limited to the duration of the first cART. CD4+ T cell recovery was analyzed in 3 different treatment groups: nonboosted PI, NNRTI, or boosted PI. The end point was the absolute increase of CD4+ T cell count in the 3 treatment groups after the initiation of cART. RESULTS: Two thousand five hundred ninety individuals (78.7%) initiated a nonboosted-PI regimen, 452 (13.7%) initiated an NNRTI regimen, and 251 (7.6%) initiated a boosted-PI regimen. Absolute CD4+ T cell count increases at 48 months were as follows: in the nonboosted-PI group, from 210 to 520 cells/μL; in the NNRTI group, from 220 to 475 cells/μL; and in the boosted-PI group, from 168 to 511 cells/μL. In a multivariate analysis, the treatment group did not affect the response of CD4+ T cells; however, increased age, pretreatment with nucleoside reverse-transcriptase inhibitors, serological tests positive for hepatitis C virus, Centers for Disease Control and Prevention stage C infection, lower baseline CD4+ T cell count, and lower baseline HIV-1 RNA level were risk factors for smaller increases in CD4+ T cell count. CONCLUSION: CD4+ T cell recovery was similar in patients receiving nonboosted PI-, NNRTI-, and boosted PI-based cART.
Abstract:
This study aimed to identify the microbial contamination of water from dental chair units (DCUs) using the prevalence of Pseudomonas aeruginosa, Legionella species and heterotrophic bacteria as a marker of pollution in water in the area of St. Gallen, Switzerland. Water (250 ml) from 76 DCUs was collected twice (early in the morning before using all the instruments and after using the DCUs for at least two hours) either from the high-speed handpiece tube, the 3 in 1 syringe or the micromotor for water quality testing. An increased bacterial count (>300 CFU/ml) was found in 46 (61%) samples taken before use of the DCU, but only in 29 (38%) samples taken two hours after use. Pseudomonas aeruginosa was found in both water samples in 6/76 (8%) of the DCUs. Legionella were found in both samples in 15 (20%) of the DCUs tested. Legionella anisa was identified in seven samples and Legionella pneumophila was found in eight. DCUs which were less than five years old were contaminated less often than older units (25% and 77%, p<0.001). This difference remained significant (p=0.0004) when adjusted for manufacturer and sampling location in a multivariable logistic regression. A large proportion of the DCUs tested complied with neither the Swiss drinking water standards nor the recommendations of the American Centers for Disease Control and Prevention (CDC).
Abstract:
HYPOTHESIS: Clinically apparent surgical glove perforation increases the risk of surgical site infection (SSI). DESIGN: Prospective observational cohort study. SETTING: University Hospital Basel, with an average of 28,000 surgical interventions per year. PARTICIPANTS: Consecutive series of 4147 surgical procedures performed in the Visceral Surgery, Vascular Surgery, and Traumatology divisions of the Department of General Surgery. MAIN OUTCOME MEASURES: The outcome of interest was SSI occurrence as assessed pursuant to the Centers for Disease Control and Prevention standards. The primary predictor variable was compromised asepsis due to glove perforation. RESULTS: The overall SSI rate was 4.5% (188 of 4147 procedures). Univariate logistic regression analysis showed a higher likelihood of SSI in procedures in which gloves were perforated compared with interventions with maintained asepsis (odds ratio [OR], 2.0; 95% confidence interval [CI], 1.4-2.8; P < .001). However, multivariate logistic regression analyses showed that the increase in SSI risk with perforated gloves was different for procedures with vs those without surgical antimicrobial prophylaxis (test for effect modification, P = .005). Without antimicrobial prophylaxis, glove perforation entailed significantly higher odds of SSI compared with the reference group with no breach of asepsis (adjusted OR, 4.2; 95% CI, 1.7-10.8; P = .003). On the contrary, when surgical antimicrobial prophylaxis was applied, the likelihood of SSI was not significantly higher for operations in which gloves were punctured (adjusted OR, 1.3; 95% CI, 0.9-1.9; P = .26). CONCLUSION: Without surgical antimicrobial prophylaxis, glove perforation increases the risk of SSI.
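The "test for effect modification" above corresponds to testing an interaction term between glove perforation and antimicrobial prophylaxis in the logistic model. The sketch below illustrates that idea on simulated data; the variable names and risks are hypothetical and are not the study's actual records or coefficients.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from scipy.stats import chi2

rng = np.random.default_rng(2)
n = 4147
perforation = rng.binomial(1, 0.15, n)
prophylaxis = rng.binomial(1, 0.80, n)
# Simulated risk: perforation raises SSI risk mainly when prophylaxis is absent
p = 0.03 + 0.09 * perforation * (1 - prophylaxis) + 0.01 * perforation * prophylaxis
ssi = rng.binomial(1, p)
df = pd.DataFrame({"ssi": ssi, "perforation": perforation, "prophylaxis": prophylaxis})

with_interaction = smf.logit("ssi ~ perforation * prophylaxis", data=df).fit(disp=False)
main_effects_only = smf.logit("ssi ~ perforation + prophylaxis", data=df).fit(disp=False)

# Likelihood-ratio test for the interaction (effect-modification) term
lr = 2 * (with_interaction.llf - main_effects_only.llf)
print(f"LR statistic = {lr:.2f}, p = {chi2.sf(lr, df=1):.4f}")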
Abstract:
BACKGROUND: The purpose of the study was to investigate allogeneic blood transfusion (ABT) and preoperative anemia as risk factors for surgical site infection (SSI). STUDY DESIGN AND METHODS: A prospective, observational cohort of 5873 consecutive general surgical procedures at Basel University Hospital was analyzed to determine the relationship between perioperative ABT and preoperative anemia and the incidence of SSI. ABT was defined as transfusion of leukoreduced red blood cells during surgery and anemia as a hemoglobin concentration of less than 120 g/L before surgery. Surgical wounds and resulting infections were assessed according to Centers for Disease Control standards. RESULTS: The overall SSI rate was 4.8% (284 of 5873). In univariable logistic regression analyses, perioperative ABT (crude odds ratio [OR], 2.93; 95% confidence interval [CI], 2.1 to 4.0; p < 0.001) and preoperative anemia (crude OR, 1.32; 95% CI, 1.0 to 1.7; p = 0.037) were significantly associated with increased odds of SSI. After adjusting for 13 characteristics of the patient and the procedure in multivariable analyses, associations were substantially reduced for ABT (OR, 1.25; 95% CI, 0.8 to 1.9; p = 0.310; and OR, 1.07; 95% CI, 0.6 to 2.0; p = 0.817 for 1-2 blood units and ≥3 blood units, respectively) and anemia (OR, 0.91; 95% CI, 0.7 to 1.2; p = 0.530). Duration of surgery was the main confounding variable. CONCLUSION: Our findings point to important confounding factors and strengthen existing doubts about leukoreduced ABT during general surgery and preoperative anemia as risk factors for SSI.
Abstract:
BACKGROUND: To compare the incidence and timing of bone fractures in postmenopausal women treated with 5 years of adjuvant tamoxifen or letrozole for endocrine-responsive early breast cancer in the Breast International Group (BIG) 1-98 trial. METHODS: We evaluated 4895 patients allocated to 5 years of letrozole or tamoxifen in the BIG 1-98 trial who received at least some study medication (median follow-up 60.3 months). Bone fracture information (grade, cause, site) was collected every 6 months during trial treatment. RESULTS: The incidence of bone fractures was higher among patients treated with letrozole [228 of 2448 women (9.3%)] than with tamoxifen [160 of 2447 women (6.5%)]. The wrist was the most common site of fracture in both treatment groups. Statistically significant risk factors for bone fractures during treatment included age, smoking history, osteoporosis at baseline, previous bone fracture, and previous hormone replacement therapy. CONCLUSIONS: Consistent with other trials comparing aromatase inhibitors with tamoxifen, letrozole was associated with an increase in bone fractures. The benefits of superior disease control with letrozole and the lower incidence of fracture with tamoxifen should be weighed against the risk profile of individual patients.
Abstract:
Chronic myeloid leukemia (CML) is a clonal myeloproliferative neoplasia arising from the oncogenic break point cluster region/Abelson murine leukemia viral oncogene homolog 1 translocation in hematopoietic stem cells (HSCs), resulting in a leukemia stem cell (LSC). Curing CML depends on the eradication of LSCs. Unfortunately, LSCs are resistant to current treatment strategies. The host's immune system is thought to contribute to disease control, and several immunotherapy strategies are under investigation. However, the interaction of the immune system with LSCs is poorly defined. In the present study, we use a murine CML model to show that LSCs express major histocompatibility complex (MHC) and co-stimulatory molecules and are recognized and killed by leukemia-specific CD8+ effector CTLs in vitro. In contrast, therapeutic infusions of effector CTLs into CML mice in vivo failed to eradicate LSCs but, paradoxically, increased LSC numbers. LSC proliferation and differentiation were induced by CTL-secreted IFN-γ. Effector CTLs were only able to eliminate LSCs in a situation with minimal leukemia load, where CTL-secreted IFN-γ levels were low. In addition, IFN-γ increased proliferation and colony formation of CD34+ stem/progenitor cells from CML patients in vitro. Our study reveals a novel mechanism by which the immune system contributes to leukemia progression and may be important for improving T cell–based immunotherapy against leukemia.
Abstract:
BACKGROUND: Accidental poisoning is one of the leading causes of injury in the United States, second only to motor vehicle accidents. According to the Centers for Disease Control and Prevention, rates of accidental poisoning mortality have been increasing nationally over the past fourteen years. In Texas, mortality rates from accidental poisoning have mirrored national trends, increasing linearly from 1981 to 2001. The purpose of this study was to determine whether there are spatiotemporal clusters of accidental poisoning mortality among Texas counties, and if so, whether there are variations in clustering and risk according to gender and race/ethnicity. The spatial scan statistic in combination with GIS software was used to identify potential clusters between 1980 and 2001 among Texas counties, and Poisson regression was used to evaluate risk differences. RESULTS: Several significant (p < 0.05) accidental poisoning mortality clusters were identified in different regions of Texas. The geographic and temporal persistence of clusters was found to vary by racial group, gender, and race/gender combinations, and most of the clusters persisted into the present decade. Poisson regression revealed significant differences in risk according to race and gender. The Black population was found to be at greatest risk of accidental poisoning mortality relative to other race/ethnic groups (Relative Risk (RR) = 1.25, 95% Confidence Interval (CI) = 1.24 – 1.27), and the male population was found to be at elevated risk (RR = 2.47, 95% CI = 2.45 – 2.50) when the female population was used as a reference. CONCLUSION: The findings of the present study provide evidence for the existence of accidental poisoning mortality clusters in Texas, demonstrate the persistence of these clusters into the present decade, and show the spatiotemporal variations in risk and clustering of accidental poisoning deaths by gender and race/ethnicity. By quantifying disparities in accidental poisoning mortality by place, time and person, this study demonstrates the utility of the spatial scan statistic combined with GIS and regression methods in identifying priority areas for public health planning and resource allocation.
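The relative risks reported above are the kind of estimate produced by Poisson regression of death counts with population (or person-years) as an exposure offset; exp(coefficient) gives the rate ratio against the reference group. A minimal sketch on simulated county-level data follows; the counts, rates, and records are hypothetical, not the Texas data.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n_counties = 254  # Texas has 254 counties; the counts below are simulated
frames = []
for sex, rate in [("female", 5e-5), ("male", 12e-5)]:   # assumed deaths per person-year
    pop = rng.integers(1_000, 2_000_000, n_counties)
    deaths = rng.poisson(rate * pop)
    frames.append(pd.DataFrame({"deaths": deaths, "pop": pop, "sex": sex}))
df = pd.concat(frames, ignore_index=True)

# Poisson regression with a log(population) offset; female is the reference level
fit = smf.poisson("deaths ~ C(sex, Treatment('female'))",
                  data=df, offset=np.log(df["pop"])).fit(disp=False)
print(np.exp(fit.params))   # exp(coefficient) for male = mortality rate ratio vs female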
Abstract:
Tumors comprising the spectrum of hemangiopericytoma/malignant solitary fibrous tumor (HPC/SFT) are thought to arise from fibroblasts and represent a small subset of soft tissue sarcomas. Surgery is typically the treatment of choice for localized disease, with reported 10-year overall survival rates of 54-89% after complete surgical resection. However, for the approximately 20% of HPC/SFT patients who eventually develop local recurrences and/or distant metastases, options for effective treatment are limited and poorly defined. Alternative therapeutic options are therefore needed for improved palliation and disease control. We hypothesize that HPC/SFT are a spectrum of soft tissue tumors with a unique clinical, pathological, and molecular makeup and clinical behavior, and that they respond to therapeutic agents that specifically target aberrations unique to these tumors. We retrospectively reviewed the characteristics and clinical outcomes of all HPC/SFT patients whose tumor specimens had been reviewed by an MD Anderson Cancer Center pathologist between January 1993 and June 2007 and who were treated at the institution with available electronic medical records. We identified 128 patients, 79 with primary localized disease and 49 with recurrent and/or metastatic disease. For the 23 patients with advanced HPC/SFT who received adriamycin-based, gemcitabine-based, or paclitaxel chemotherapy as first- or second-line therapy, the overall RECIST response rate was 0%. Most patients achieved a brief duration of disease stabilization on chemotherapy, with a median progression-free survival (PFS) of 4.6 months. For the 14 patients with advanced HPC/SFT who received temozolomide and bevacizumab systemic therapy, the overall RECIST response rate was 14% and the overall Choi response rate was 79%. The median PFS for this cohort was 9.7 months, with a 6-month progression-free rate of 78.6%. The most frequently observed toxic effect of temozolomide-bevacizumab therapy was myelosuppression. We have designed a phase II study to evaluate the safety and efficacy of temozolomide-bevacizumab in locally advanced, recurrent, and metastatic HPC/SFT in a prospective manner. Combination therapy with temozolomide and bevacizumab may be a clinically beneficial regimen for advanced HPC/SFT patients.