51 results for 860[729.1].07[Sarduy]


Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: Antiretroviral therapy (ART) decreases morbidity and mortality in HIV-infected patients but is associated with considerable adverse events (AEs). METHODS: We examined the effect of AEs attributed to ART on mortality, treatment modifications and drop-out in the Swiss HIV Cohort Study. A cross-sectional evaluation of the prevalence of 13 clinical and 11 laboratory parameters was performed in 1999 in 1,078 patients on ART. AEs were defined as abnormalities probably or certainly related to ART. A score incorporating the number and severity of AEs was defined. Subsequent progression to death, drop-out and treatment modification due to intolerance was evaluated according to the baseline AE score and the characteristics of individual AEs. RESULTS: Among the 1,078 patients, laboratory AEs were reported in 23% and clinical AEs in 45%. During a median follow-up of 5.9 years, laboratory AEs were associated with higher mortality, with an adjusted hazard ratio (HR) of 1.3 (95% confidence interval [CI] 1.2-1.5; P < 0.001) per score point. For clinical AEs, no significant association with increased mortality was found. In contrast, an increasing score for clinical AEs (HR 1.11, 95% CI 1.04-1.18; P = 0.002), but not for laboratory AEs (HR 1.07, 95% CI 0.97-1.17; P = 0.17), was associated with antiretroviral treatment modification. AEs were not associated with a higher drop-out rate. CONCLUSIONS: The burden of laboratory AEs attributed to antiretroviral drugs is associated with higher mortality. Physicians seem to change treatments to relieve clinical symptoms while accepting laboratory AEs. Minimizing laboratory drug toxicity seems warranted, and its influence on survival should be further evaluated.


OBJECTIVE: To examine variability in outcome and resource use between ICUs. Secondary aims were to assess whether outcome and resource use are related to ICU structure and process, and to explore factors associated with efficient resource use. DESIGN AND SETTING: Cohort study based on the SAPS 3 database in 275 ICUs worldwide. PATIENTS: 16,560 adults. MEASUREMENTS AND RESULTS: Outcome was defined by the standardized mortality rate (SMR). Standardized resource use (SRU) was calculated based on length of stay in the ICU, adjusted for severity of acute illness. Each unit was assigned to one of four groups: "most efficient" (SMR and SRU below the median), "least efficient" (SMR and SRU above the median), "overachieving" (low SMR, high SRU) and "underachieving" (high SMR, low SRU). Univariate analysis and stepwise logistic regression were used to test for factors separating the "most" from the "least efficient" units. The overall median SMR was 1.00 (IQR 0.77-1.28) and the median SRU 1.07 (0.76-1.58). There were 91 "most efficient", 91 "least efficient", 47 "overachieving" and 46 "underachieving" ICUs. The number of physicians, of full-time specialists and of nurses per bed, clinical rounds, availability of physicians, presence of an emergency department, and geographical region were significant in univariate analysis. In multivariate analysis, only interprofessional rounds, an emergency department and geographical region entered the model as significant. CONCLUSIONS: Despite considerable variability in outcome and resource use, only a few factors of ICU structure and process were associated with efficient use of ICU resources. This suggests that other confounding factors play an important role.
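The four efficiency groups above amount to a quadrant rule on SMR and SRU relative to their cohort medians. A minimal sketch of that rule in Python (the default medians are the overall values reported above; the function name and the handling of values exactly at the median are illustrative assumptions, not from the study):

```python
def classify_icu(smr, sru, smr_median=1.00, sru_median=1.07):
    """Assign an ICU to one of the four SAPS 3 efficiency quadrants.

    Defaults are the overall medians reported in the study (SMR 1.00,
    SRU 1.07); in practice the cohort-specific medians would be passed in.
    """
    low_smr = smr < smr_median
    low_sru = sru < sru_median
    if low_smr and low_sru:
        return "most efficient"      # better than median on both axes
    if not low_smr and not low_sru:
        return "least efficient"     # worse than median on both axes
    if low_smr:
        return "overachieving"       # low mortality bought with high resource use
    return "underachieving"          # low resource use but high mortality

print(classify_icu(0.85, 0.90))  # most efficient
```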


Streptococcus sinensis has been described as a causative organism for infective endocarditis in 3 Chinese patients from Hong Kong. We describe a closely related strain in an Italian patient with chronic rheumatic heart disease. The case illustrates that S. sinensis is a worldwide emerging pathogen.


OBJECTIVE: The objective of this study was to evaluate the feasibility and reproducibility of high-resolution magnetic resonance imaging (MRI) and quantitative T2 mapping of the talocrural cartilage within a clinically applicable scan time, using a new dedicated ankle coil and high-field MRI. MATERIALS AND METHODS: Ten healthy volunteers (mean age 32.4 years) underwent MRI of the ankle. As morphological sequences, proton density fat-suppressed turbo spin echo (PD-FS-TSE), as a reference, was compared with 3D true fast imaging with steady-state precession (TrueFISP). Furthermore, biochemical quantitative T2 imaging was performed using a multi-echo spin-echo T2 approach. Data analysis was performed three times each by three different observers on sagittal slices planned on the isotropic 3D-TrueFISP; as a morphological parameter, cartilage thickness was assessed, and for T2 relaxation times, region-of-interest (ROI) evaluation was done. Reproducibility was determined as a coefficient of variation (CV) for each volunteer and averaged as a root mean square average (RMSA), given as a percentage; statistical evaluation used analysis of variance. RESULTS: Cartilage thickness of the talocrural joint showed significantly higher values for 3D-TrueFISP (ranging from 1.07 to 1.14 mm) than for PD-FS-TSE (ranging from 0.74 to 0.99 mm); however, both morphological sequences showed comparably good results, with RMSA of 7.1 to 8.5%. Regarding quantitative T2 mapping, measurements showed T2 relaxation times of about 54 ms with excellent reproducibility (RMSA ranging from 3.2 to 4.7%). CONCLUSION: In our study, the assessment of cartilage thickness and T2 relaxation times could be performed with high reproducibility in a clinically realizable scan time, demonstrating new possibilities for further investigations in patient groups.
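The reproducibility statistics above can be made concrete under a common reading of the definitions: the CV is the standard deviation of a volunteer's repeated measurements divided by their mean, and the RMSA is the root mean square of the per-volunteer CVs. Both the exact formulas and the sample readings below are illustrative assumptions, not values from the study:

```python
import math

def cv_percent(measurements):
    """Coefficient of variation (SD / mean) of repeated measurements, in %."""
    n = len(measurements)
    mean = sum(measurements) / n
    sd = math.sqrt(sum((x - mean) ** 2 for x in measurements) / (n - 1))
    return 100.0 * sd / mean

def rmsa(cvs):
    """Root mean square average of the per-volunteer CVs, in %."""
    return math.sqrt(sum(cv ** 2 for cv in cvs) / len(cvs))

# Hypothetical repeated T2 readings (ms), three measurements per volunteer:
readings = [[54.1, 53.2, 55.0], [52.8, 54.6, 53.9], [55.2, 54.0, 53.1]]
print(round(rmsa([cv_percent(r) for r in readings]), 1))
```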


The purpose was to evaluate the relative glycosaminoglycan (GAG) content of repair tissue in patients after microfracture (MFX) and matrix-associated autologous chondrocyte transplantation (MACT) of the knee joint with a dGEMRIC technique based on a newly developed short 3D-GRE sequence with two flip angle excitation pulses. Twenty patients treated with MFX or MACT (ten in each group) were enrolled. For comparability, patients from each group were matched by age (MFX: 37.1 ± 16.3 years; MACT: 37.4 ± 8.2 years) and postoperative interval (MFX: 33.0 ± 17.3 months; MACT: 32.0 ± 17.2 months). The difference in relaxation rate (ΔR1) between repair tissue and normal hyaline cartilage and the relative ΔR1 were calculated, and mean values were compared between the two groups using an analysis of variance. The mean ΔR1 for MFX was 1.07 ± 0.34 versus 0.32 ± 0.20 at the intact control site, and for MACT 1.90 ± 0.49 compared to 0.87 ± 0.44, resulting in a relative ΔR1 of 3.39 for MFX and 2.18 for MACT. The difference between the cartilage repair groups was statistically significant. The new dGEMRIC technique based on dual flip angle excitation pulses showed higher GAG content in patients after MACT than after MFX at the same postoperative interval, and allowed the data acquisition time to be reduced to 4 min.
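Assuming the usual dGEMRIC convention that the relative ΔR1 is the ratio of ΔR1 in repair tissue to ΔR1 in intact control cartilage (an assumption, since the abstract does not spell out the formula), the figures can be recomputed from the group means; the small gap to the published 3.39 for MFX presumably reflects rounding of the reported means:

```python
def relative_dR1(dR1_repair, dR1_control):
    """Ratio of ΔR1 in repair tissue to ΔR1 in intact control cartilage."""
    return dR1_repair / dR1_control

# Rounded group means taken from the abstract:
print(round(relative_dR1(1.07, 0.32), 2))  # MFX: 3.34 (published value: 3.39)
print(round(relative_dR1(1.90, 0.87), 2))  # MACT: 2.18
```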


The relation between residential magnetic field exposure from power lines and mortality from neurodegenerative conditions was analyzed among 4.7 million persons of the Swiss National Cohort (linking mortality and census data), covering the period 2000-2005. Cox proportional hazard models were used to analyze the relation of living in the proximity of 220-380 kV power lines and the risk of death from neurodegenerative diseases, with adjustment for a range of potential confounders. Overall, the adjusted hazard ratio for Alzheimer's disease in persons living within 50 m of a 220-380 kV power line was 1.24 (95% confidence interval (CI): 0.80, 1.92) compared with persons who lived at a distance of 600 m or more. There was a dose-response relation with respect to years of residence in the immediate vicinity of power lines and Alzheimer's disease: Persons living at least 5 years within 50 m had an adjusted hazard ratio of 1.51 (95% CI: 0.91, 2.51), increasing to 1.78 (95% CI: 1.07, 2.96) with at least 10 years and to 2.00 (95% CI: 1.21, 3.33) with at least 15 years. The pattern was similar for senile dementia. There was little evidence for an increased risk of amyotrophic lateral sclerosis, Parkinson's disease, or multiple sclerosis.


Introduction: The aim of this systematic review was to analyze the dental literature regarding accuracy and clinical application in computer-guided, template-based implant dentistry. Materials and methods: An electronic literature search, complemented by manual searching, was performed to gather data on accuracy and on surgical, biological and prosthetic complications in connection with computer-guided implant treatment. For the assessment of accuracy, a meta-regression analysis was performed. Complication rates are summarized descriptively. Results: Of the 3120 titles retrieved by the literature search, eight articles met the inclusion criteria regarding accuracy and 10 regarding clinical performance. Meta-regression analysis revealed a mean deviation of 1.07 mm (95% CI: 0.76-1.22 mm) at the entry point and 1.63 mm (95% CI: 1.26-2.00 mm) at the apex. No significant differences between the studies were found regarding the method of template production or template support and stabilization. Early surgical complications occurred in 9.1% of cases, early prosthetic complications in 18.8% and late prosthetic complications in 12%. Implant survival rates of 91-100% after an observation time of 12-60 months are reported in six clinical studies with 537 implants, mainly restored immediately after flapless implantation procedures. Conclusion: Computer-guided, template-based implant placement showed high implant survival rates, ranging from 91% to 100%. However, a considerable number of technique-related perioperative complications were observed. Preclinical and clinical studies indicated a reasonable mean accuracy but relatively high maximum deviations. Future research should aim to increase the number of clinical studies with longer observation periods and to improve the systems in terms of perioperative handling, accuracy and prosthetic complications.


BACKGROUND: The purpose of the study was to investigate allogeneic blood transfusion (ABT) and preoperative anemia as risk factors for surgical site infection (SSI). STUDY DESIGN AND METHODS: A prospective, observational cohort of 5873 consecutive general surgical procedures at Basel University Hospital was analyzed to determine the relationship between perioperative ABT, preoperative anemia and the incidence of SSI. ABT was defined as transfusion of leukoreduced red blood cells during surgery, and anemia as a hemoglobin concentration of less than 120 g/L before surgery. Surgical wounds and resulting infections were assessed according to Centers for Disease Control standards. RESULTS: The overall SSI rate was 4.8% (284 of 5873). In univariable logistic regression analyses, perioperative ABT (crude odds ratio [OR], 2.93; 95% confidence interval [CI], 2.1 to 4.0; p < 0.001) and preoperative anemia (crude OR, 1.32; 95% CI, 1.0 to 1.7; p = 0.037) were significantly associated with increased odds of SSI. After adjusting for 13 characteristics of the patient and the procedure in multivariable analyses, the associations were substantially reduced for ABT (OR, 1.25; 95% CI, 0.8 to 1.9; p = 0.310 for 1-2 blood units and OR, 1.07; 95% CI, 0.6 to 2.0; p = 0.817 for ≥3 blood units) and anemia (OR, 0.91; 95% CI, 0.7 to 1.2; p = 0.530). Duration of surgery was the main confounding variable. CONCLUSION: Our findings point to important confounding factors and strengthen existing doubts about leukoreduced ABT during general surgery and preoperative anemia as risk factors for SSI.


Equine influenza virus (EIV) surveillance is important in the management of equine influenza, providing data on circulating and newly emerging strains for vaccine strain selection. To this end, antigenic characterisation by haemagglutination inhibition (HI) assay and phylogenetic analysis was carried out on 28 EIV strains isolated in North America and Europe during 2006 and 2007. In the UK, 20 viruses were isolated from 28 nasopharyngeal swabs that tested positive by enzyme-linked immunosorbent assay. All except two of the UK viruses were characterised as members of the Florida sublineage with similarity to A/eq/Newmarket/5/03 (clade 2). One isolate, A/eq/Cheshire/1/06, was characterised as an American lineage strain similar to viruses isolated up to 10 years earlier. A second isolate, A/eq/Lincolnshire/1/07, was characterised as a member of the Florida sublineage (clade 1) with similarity to A/eq/Wisconsin/03. Furthermore, A/eq/Lincolnshire/1/06 was a member of the Florida sublineage (clade 2) by haemagglutinin (HA) gene sequence but appeared to be a member of the Eurasian lineage by non-structural (NS) gene sequence, suggesting that reassortment had occurred. A/eq/Switzerland/P112/07 was characterised as a member of the Eurasian lineage, the first reported isolation of a virus from this lineage since 2005. Seven viruses from North America were classified as members of the Florida sublineage (clade 1), similar to A/eq/Wisconsin/03. In conclusion, a variety of antigenically distinct EIVs continue to circulate worldwide; Florida sublineage clade 1 viruses appear to predominate in North America and clade 2 viruses in Europe.


OBJECTIVES To identify factors associated with discrepant outcome reporting in randomized drug trials. STUDY DESIGN AND SETTING Cohort study of protocols submitted to a Swiss ethics committee during 1988-1998: 227 protocols and amendments were compared with 333 matching articles published during 1990-2008. Discrepant reporting was defined as addition, omission, or reclassification of outcomes. RESULTS Overall, 870 of 2,966 unique outcomes were reported discrepantly (29.3%). Among protocol-defined primary outcomes, 6.9% were not reported (19 of 274), whereas 10.4% of reported primary outcomes (30 of 288) were not defined in the protocol. Corresponding percentages for secondary outcomes were 19.0% (284 of 1,495) and 14.1% (334 of 2,375). Discrepant reporting was more likely if P values were <0.05 compared with P ≥ 0.05 [adjusted odds ratio (aOR): 1.38; 95% confidence interval (CI): 1.07, 1.78], more likely for efficacy compared with harm outcomes (aOR: 2.99; 95% CI: 2.08, 4.30), and more likely for composite than for single outcomes (aOR: 1.48; 95% CI: 1.00, 2.20). Cardiology (aOR: 2.34; 95% CI: 1.44, 3.79) and infectious diseases (aOR: 1.77; 95% CI: 1.01, 3.13) had more discrepancies compared with all specialties combined. CONCLUSION Discrepant reporting was associated with statistical significance of results, type of outcome, and specialty area. Trial protocols should be made freely available, and publications should describe and justify any changes made to protocol-defined outcomes.
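The discrepancy percentages above follow directly from the reported counts; a quick arithmetic check:

```python
def pct(numerator, denominator):
    """Percentage rounded to one decimal, as in the abstract."""
    return round(100.0 * numerator / denominator, 1)

# Counts reported in the abstract:
print(pct(870, 2966))   # all outcomes reported discrepantly: 29.3
print(pct(19, 274))     # primary outcomes defined but not reported: 6.9
print(pct(30, 288))     # reported primary outcomes not in the protocol: 10.4
print(pct(284, 1495))   # secondary outcomes not reported: 19.0
print(pct(334, 2375))   # reported secondary outcomes not in the protocol: 14.1
```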


OBJECTIVES The impact of diagnostic delay (the period from the appearance of first symptoms to diagnosis) on the clinical course of Crohn's disease (CD) is unknown. We examined whether the length of diagnostic delay affects disease outcomes. METHODS Data from the Swiss IBD cohort study were analyzed. Patients were recruited from university centers (68%), regional hospitals (14%), and private practices (18%). The frequencies of bowel stenoses, internal fistulas, perianal fistulas, and CD-related surgery (intestinal and perianal) were analyzed. RESULTS A total of 905 CD patients (53.4% female, median age at diagnosis 26 (20-36) years) were stratified into four groups according to the quartiles of diagnostic delay (0-3, 4-9, 10-24, and ≥25 months, respectively). The median diagnostic delay was 9 (3-24) months. The frequency of immunomodulator and/or anti-tumor necrosis factor drug use did not differ among the four groups. The length of diagnostic delay was positively correlated with the occurrence of bowel stenosis (odds ratio (OR) 1.76, P=0.011 for a delay of ≥25 months) and intestinal surgery (OR 1.76, P=0.014 for a delay of 10-24 months and OR 2.03, P=0.003 for a delay of ≥25 months). Disease duration was positively associated, and non-ileal disease location negatively associated, with bowel stenosis (OR 1.07, P<0.001, and OR 0.41, P=0.005, respectively) and intestinal surgery (OR 1.14, P<0.001, and OR 0.23, P<0.001, respectively). CONCLUSIONS The length of diagnostic delay is correlated with an increased risk of bowel stenosis and CD-related intestinal surgery. Efforts should be undertaken to shorten the diagnostic delay.
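The stratification above is a simple binning of diagnostic delay into the four quartile groups; a minimal sketch (cut-offs taken from the abstract, function name illustrative):

```python
def delay_group(delay_months):
    """Map diagnostic delay in months to the study's four quartile groups."""
    if delay_months <= 3:
        return "0-3"
    if delay_months <= 9:
        return "4-9"
    if delay_months <= 24:
        return "10-24"
    return ">=25"

print(delay_group(9))  # "4-9": the cohort's median delay of 9 months falls in the second group
```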


OBJECTIVES Mortality in patients starting antiretroviral therapy (ART) is higher in Malawi and Zambia than in South Africa. We examined whether different monitoring of ART (viral load [VL] in South Africa and CD4 count in Malawi and Zambia) could explain this mortality difference. DESIGN Mathematical modelling study based on data from ART programmes. METHODS We used a stochastic simulation model to study the effect of VL monitoring on mortality over 5 years. In baseline scenario A, all parameters were identical between strategies except for more timely and complete detection of treatment failure with VL monitoring. Additional scenarios introduced delays in switching to second-line ART (scenario B) or higher virologic failure rates (due to worse adherence) when monitoring was based on CD4 counts only (scenario C). Results are presented as relative risks (RR) with 95% prediction intervals, and as the percentage of the observed mortality difference explained. RESULTS RRs comparing VL with CD4 cell count monitoring were 0.94 (0.74-1.03) in scenario A, 0.94 (0.77-1.02) with delayed switching (scenario B) and 0.80 (0.44-1.07) when assuming a three-times higher rate of failure (scenario C). The observed mortality at 3 years was 10.9% in Malawi and Zambia and 8.6% in South Africa (absolute difference 2.3%). The percentage of the mortality difference explained by VL monitoring ranged from 4% (scenario A) to 32% (scenarios B and C combined, assuming a three-times higher failure rate). Eleven percent was explained by non-HIV-related mortality. CONCLUSIONS VL monitoring reduces mortality moderately when assuming improved adherence and decreased failure rates.


BACKGROUND There is weak evidence to support the benefit of periodontal maintenance therapy in preventing tooth loss. In addition, the effects of long-term periodontal treatment on general health are unclear. METHODS Compliant and partially compliant patients in a private practice (15 to 25 years of follow-up) were observed for changes in oral and systemic health. RESULTS A total of 219 compliant patients (91 males and 128 females) were observed for 19.1 (range: 15 to 25; SD ± 2.8) years. Age at reassessment was 64.6 (range: 39 to 84; SD ± 9.0) years. A total of 145 patients were stable (0 to 3 teeth lost), 54 were downhill (4 to 6 teeth lost), and 21 were extreme downhill (>6 teeth lost); 16 patients developed hypertension, 13 developed type 2 diabetes, and 15 suffered myocardial infarctions (MIs). A minority developed other systemic diseases. Risk factors for MI included overweight (odds ratio [OR]: 9.04; 95% confidence interval [CI]: 2.9 to 27.8; P < 0.001), family history of cardiovascular disease (OR: 3.10; 95% CI: 1.07 to 8.94; P = 0.029), type 1 diabetes at baseline (P = 0.02), and developing type 2 diabetes (OR: 7.9; 95% CI: 2.09 to 29.65; P < 0.001). A total of 25 partially compliant patients (17 males and 8 females) were observed for 19 years. This group had a higher proportion of downhill and extreme downhill cases and of MI. CONCLUSIONS Patients who left the maintenance program in a periodontal specialist practice in Norway had a higher rate of tooth loss than compliant patients. Compliant patients in this practice had a risk of developing type 2 diabetes similar to that of the general population. A rate of 0.0037 MIs per patient per year was recorded for this group. Due to the lack of external data, it is difficult to assess how this compares with patients who have untreated periodontal disease.


Lung function measures are heritable, predict mortality and are relevant in the diagnosis of chronic obstructive pulmonary disease (COPD). COPD and asthma are diseases of the airways with major public health impacts, and each has a heritable component. Genome-wide association studies of SNPs have revealed novel genetic associations with both diseases but account for only a small proportion of the heritability. Complex copy number variation may account for some of the missing heritability. A well-characterised genomic region of complex copy number variation contains the beta-defensin genes (DEFB103, DEFB104 and DEFB4), which have a role in the innate immune response. Previous studies have implicated these and related genes in asthma and COPD. We hypothesised that copy number variation of these genes may play a role in lung function in the general population and in COPD and asthma risk. We undertook copy number typing of this locus in 1149 adults and 689 children using a paralogue ratio test and investigated associations with COPD, asthma and lung function. Replication of findings was assessed in a larger independent sample of COPD cases and smoking controls. We found evidence for an association of beta-defensin copy number with COPD in the adult cohort (OR = 1.4, 95% CI: 1.02-1.92, P = 0.039), but this finding, and findings from a previous study, were not replicated in a larger follow-up sample (OR = 0.89, 95% CI: 0.72-1.07, P = 0.217). No robust evidence of association with asthma in children was observed. We found no evidence for an association between beta-defensin copy number and lung function in the general population. Our findings suggest that previous reports of association of beta-defensin copy number with COPD should be viewed with caution. Suboptimal measurement of copy number can lead to spurious associations. Further beta-defensin copy number measurements in larger samples of COPD cases and children with asthma are needed.


Aims: Newer-generation everolimus-eluting stents (EES) have been shown to improve clinical outcomes compared with early-generation sirolimus-eluting (SES) and paclitaxel-eluting stents (PES) in patients undergoing percutaneous coronary intervention (PCI). Whether this benefit is maintained among patients with saphenous vein graft (SVG) disease remains controversial. Methods and results: We assessed cumulative incidence rates (CIR) per 100 patient-years after inverse probability of treatment weighting to compare clinical outcomes. The pre-specified primary endpoint was the composite of cardiac death, myocardial infarction (MI) and target vessel revascularisation (TVR). Of 12,339 consecutively treated patients, 288 (5.7%) underwent PCI of at least one SVG lesion with EES (n=127), SES (n=103) or PES (n=58). Up to four years, CIR of the primary endpoint were 58.7 for EES, 45.2 for SES and 45.6 for PES, with similar adjusted risks between groups (EES vs. SES: HR 0.94, 95% CI: 0.55-1.60; EES vs. PES: HR 1.07, 95% CI: 0.60-1.91). Adjusted risks showed no significant differences between stent types for cardiac death, MI and TVR. Conclusions: Among patients undergoing PCI for SVG lesions, newer-generation EES have similar safety and efficacy to early-generation SES and PES during long-term follow-up to four years.