Abstract:
AIMS: Recent studies of drug-eluting stents for unprotected left main coronary artery (LMCA) disease have been encouraging. We examined the performance of sirolimus-eluting stents (SES) for this indication. METHODS AND RESULTS: This retrospective study included 228 consecutive patients (mean age 68 ± 11 years, 80.6% men, 26.3% diabetics) who underwent implantation of SES for de novo LMCA stenoses. The mean additive and logistic EuroSCOREs were 5.2 ± 3.9 and 8.2 ± 13.2, respectively. The main objective of this study was to measure the rate of major adverse cardiac events (MACE), including death, myocardial infarction and target lesion revascularisation (TLR), at 12 months. Other objectives were to measure the rates of in-hospital MACE and 12-month TLR. Outcomes were compared between 143 patients with (BIF+ group) and 84 patients without (BIF− group) involvement of the bifurcation. The pre-procedural percent diameter stenosis (%DS) was 60.1 ± 11.2% in the BIF+ versus 54.7 ± 12.2% in the BIF− group (p=0.008), and decreased to 18.0 ± 9.7% and 13.9 ± 11.3%, respectively (ns), after SES implantation. The overall in-hospital MACE rate was 3.5% and similar in both subgroups. The 1-year MACE rate was 14.5% overall, 16.8% in the BIF+ and 10.7% in the BIF− subgroup (ns). CONCLUSIONS: SES implantation in high-risk patients with LMCA stenoses was associated with a low 1-year MACE rate. Stenting of the bifurcation was not associated with a significant increase in either mortality or the 1-year MACE rate.
Abstract:
Intracerebral contusions can lead to regional ischemia caused by extensive release of excitotoxic amino acids, leading to increased cytotoxic brain edema and raised intracranial pressure. rCBF measurements might provide further information about the risk of ischemia within and around contusions. The aim of the present study was therefore to compare the intra- and perilesional rCBF of hemorrhagic, non-hemorrhagic and mixed intracerebral contusions. In 44 patients, 60 stable Xenon-enhanced CT CBF studies were performed (EtCO2 30 ± 4 mmHg SD), initially 29 hours after injury (39 studies) and subsequently 95 hours after injury (21 studies). All lesions were classified according to localization and lesion type using CT/MRI scans. The rCBF was calculated within, and 1 cm adjacent to, each lesion in CT-isodense brain. The rCBF within all contusions (n = 100), 29 ± 11 ml/100 g/min, was significantly lower (p < 0.0001, Mann-Whitney U) than the perilesional rCBF of 44 ± 12 ml/100 g/min, and the intra/perilesional correlation was 0.4 (p < 0.0005). Hemorrhagic contusions showed an intra/perilesional rCBF of 31 ± 11/44 ± 13 ml/100 g/min (p < 0.005), non-hemorrhagic contusions 35 ± 13/46 ± 10 ml/100 g/min (p < 0.01). rCBF in mixed contusions (25 ± 9/44 ± 12 ml/100 g/min, p < 0.0001) was significantly lower compared with hemorrhagic and non-hemorrhagic contusions (p < 0.02). Intracontusional rCBF is significantly reduced, to 29 ± 11 ml/100 g/min, but falls below the ischemic level of 18 ml/100 g/min in only 16% of all contusions. Perilesional CBF in CT-normal-appearing brain close to contusions is not critically reduced. Further differentiation of contusions demonstrates significantly lower rCBF in mixed contusions (defined by both hyper- and hypodense areas in the CT scan) compared with hemorrhagic and non-hemorrhagic contusions. Mixed contusions may evolve from hemorrhagic contusions with secondary increased perilesional cytotoxic brain edema, leading to reduced cerebral blood flow and altered brain metabolism. Therefore, the treatment of ICP might be individually modified by the measurement of intra- and pericontusional cerebral blood flow.
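The intra- versus perilesional rCBF comparison above relies on the Mann-Whitney U test. Below is a minimal Python sketch of such a nonparametric comparison; the rCBF values are simulated around the reported group means and are not the study data.

```python
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)

# Synthetic rCBF values (ml/100 g/min) standing in for the measured data:
# intracontusional flow centred near 29, perilesional flow near 44.
intralesional = rng.normal(loc=29, scale=11, size=100)
perilesional = rng.normal(loc=44, scale=12, size=100)

# Two-sided Mann-Whitney U test, as named in the abstract.
stat, p_value = mannwhitneyu(intralesional, perilesional, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p_value:.4g}")

# Fraction of contusions below the cited ischemic threshold of 18 ml/100 g/min.
ischemic_fraction = np.mean(intralesional < 18)
print(f"Below ischemic threshold: {ischemic_fraction:.0%}")
```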
Abstract:
BACKGROUND: Patients with chemotherapy-related neutropenia and fever are usually hospitalized and treated with empirical intravenous broad-spectrum antibiotic regimens. Early diagnosis of sepsis in children with febrile neutropenia remains difficult due to non-specific clinical and laboratory signs of infection. We aimed to analyze whether IL-6 and IL-8 could define a group of patients at low risk of septicemia. METHODS: A prospective study was performed to assess the potential value of IL-6, IL-8 and C-reactive protein serum levels to predict severe bacterial infection or bacteremia in febrile neutropenic children with cancer during chemotherapy. Statistical tests used: Friedman test, Wilcoxon test, Kruskal-Wallis H test, Mann-Whitney U test and receiver operating characteristic (ROC) analysis. RESULTS: The analysis of cytokine levels measured at the onset of fever indicated that IL-6 and IL-8 are useful to define a group of patients at low risk of sepsis. In predicting bacteremia or severe bacterial infection, IL-6 was the best predictor, with an optimum cut-off level of 42 pg/ml showing high sensitivity (90%) and specificity (85%). CONCLUSION: These findings may have clinical implications for risk-based antimicrobial treatment strategies.
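The 42 pg/ml IL-6 cut-off with 90% sensitivity and 85% specificity comes from ROC analysis. A sketch of how such a cut-off can be chosen via the Youden index is shown below; the marker values are simulated for illustration (not the study data), and the Youden criterion is an assumption, since the abstract does not state how the optimum was defined.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(1)

# Simulated IL-6 levels (pg/ml): higher in episodes with bacteremia/severe infection.
il6_no_sepsis = rng.lognormal(mean=3.0, sigma=0.6, size=80)
il6_sepsis = rng.lognormal(mean=4.3, sigma=0.6, size=20)

y_true = np.r_[np.zeros_like(il6_no_sepsis), np.ones_like(il6_sepsis)]
scores = np.r_[il6_no_sepsis, il6_sepsis]

fpr, tpr, thresholds = roc_curve(y_true, scores)

# Youden index J = sensitivity + specificity - 1; the threshold maximising J
# is one common way to report an "optimum" cut-off.
j = tpr - fpr
best = np.argmax(j)
print(f"AUC = {roc_auc_score(y_true, scores):.2f}")
print(f"cut-off = {thresholds[best]:.1f} pg/ml, "
      f"sensitivity = {tpr[best]:.0%}, specificity = {1 - fpr[best]:.0%}")
```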
Abstract:
Karyotype analysis of acute lymphoblastic leukemia (ALL) at diagnosis has provided valuable prognostic markers for treatment stratification. However, reports of cytogenetic studies of relapsed ALL samples are limited. We compared the karyotypes of 436 nonselected B-cell precursor ALL patients at initial diagnosis and of 76 patients at first relapse. We noticed a relative increase of karyotypes that did not fall into the classic ALL cytogenetic subgroups (high hyperdiploidy, t(12;21), t(9;22), 11q23, t(1;19), <45 chromosomes) in a group of 29 patients at relapse (38%) compared with 130 patients at presentation (30%). Non-classical cytogenetic aberrations in these 29 patients were mostly found on chromosomes 1, 2, 7, 9, 13, 14, and 17. We also describe six rare reciprocal translocations, three of which involved 14q32. The most frequent abnormalities were found in 9p (12/29 cases) and were associated with a marked decrease in the duration of the second remission, but not in the probability of 10-year event-free survival after relapse treatment. Of the 29 patients with non-classical cytogenetic aberrations, only 8 (28%) had been stratified to a high-risk arm of the first treatment protocol, suggesting that this subgroup might benefit from the identification of new prognostic markers in future studies.
Abstract:
The rate-limiting step of dietary calcium absorption in the intestine requires the brush border calcium entry channel TRPV6. The TRPV6 gene was completely sequenced in 170 renal calcium stone patients. The frequency of an ancestral TRPV6 haplotype consisting of three non-synonymous polymorphisms (C157R, M378V, M681T) was significantly higher (P = 0.039) in calcium stone formers (8.4%; derived = 502, ancestral = 46) than in non-stone-forming individuals (5.4%; derived = 645, ancestral = 37). Mineral metabolism was investigated on four different calcium regimens: (i) free-choice diet, (ii) low-calcium diet, (iii) fasting and (iv) after a 1 g oral calcium load. When patients homozygous for the derived haplotype were compared with heterozygous patients, no differences were found with respect to the plasma concentrations of 1,25-vitamin D, PTH and calcium, or the urinary excretion of calcium. One stone-forming patient was homozygous for the ancestral haplotype; this patient had absorptive hypercalciuria. We therefore expressed the ancestral protein (157R+378V+681T) in Xenopus oocytes and found a significantly enhanced calcium permeability in a ⁴⁵Ca²⁺ uptake assay (7.11 ± 1.93 versus 3.61 ± 1.01 pmol/min/oocyte for the ancestral versus the derived haplotype, P < 0.01). These results suggest that the ancestral gain-of-function haplotype in TRPV6 plays a role in calcium stone formation in certain forms of absorptive hypercalciuria.
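The P = 0.039 haplotype-frequency comparison can be approached as a 2x2 test on the reported chromosome counts. The sketch below uses Fisher's exact test, which is an assumption: the abstract does not name the test the authors used, only the counts, which are taken directly from the text.

```python
from scipy.stats import fisher_exact

# Haplotype counts reported in the abstract:
#   stone formers:     derived = 502, ancestral = 46  -> ancestral frequency 8.4%
#   non-stone formers: derived = 645, ancestral = 37  -> ancestral frequency 5.4%
table = [[46, 502],
         [37, 645]]

odds_ratio, p_value = fisher_exact(table, alternative="two-sided")
print(f"OR = {odds_ratio:.2f}, p = {p_value:.3f}")
```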
Abstract:
BACKGROUND: The aim of this study was to explore the predictive value of longitudinal self-reported adherence data on viral rebound. METHODS: Individuals in the Swiss HIV Cohort Study on combined antiretroviral therapy (cART) with RNA <50 copies/ml over the previous 3 months and who were interviewed about adherence at least once prior to 1 March 2007 were eligible. Adherence was defined in terms of missed doses of cART (0, 1, 2 or >2) in the previous 28 days. Viral rebound was defined as RNA >500 copies/ml. Cox regression models with time-independent and time-dependent covariates were used to evaluate time to viral rebound. RESULTS: A total of 2,664 individuals and 15,530 visits were included. Across all visits, non-adherence was reported as follows: 1 missed dose 14.7%, 2 doses 5.1%, >2 doses 3.8%; taking <95% of doses 4.5%; and missing ≥2 consecutive doses 3.2%. In total, 308 (11.6%) patients experienced viral rebound. After controlling for confounding variables, self-reported non-adherence remained significantly associated with the rate of occurrence of viral rebound (compared with zero missed doses: 1 dose, hazard ratio [HR] 1.03, 95% confidence interval [CI] 0.72-1.48; 2 doses, HR 2.17, 95% CI 1.46-3.25; >2 doses, HR 3.66, 95% CI 2.50-5.34). Several variables significantly associated with an increased risk of viral rebound irrespective of adherence were identified: being on a protease inhibitor or triple nucleoside regimen (compared with a non-nucleoside reverse transcriptase inhibitor), >5 previous cART regimens, seeing a less experienced physician, taking co-medication, and a shorter time virally suppressed. CONCLUSIONS: A simple self-report adherence questionnaire, repeatedly administered, provides a sensitive measure of non-adherence that predicts viral rebound.
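Time to viral rebound was modelled with Cox regression including time-dependent covariates (adherence reported at successive visits). Below is a minimal sketch of that kind of model using lifelines' CoxTimeVaryingFitter on a long-format table; the column names and data are invented for illustration and are not the Swiss HIV Cohort Study dataset.

```python
import numpy as np
import pandas as pd
from lifelines import CoxTimeVaryingFitter

rng = np.random.default_rng(2)

# Simulated long-format data: one row per patient per 28-day period between
# adherence interviews.  'missed_doses' is the time-dependent covariate and
# 'rebound' flags RNA >500 copies/ml at the end of the period.
rows = []
for pid in range(300):
    t = 0
    for _ in range(rng.integers(1, 8)):           # up to 7 follow-up periods
        missed = rng.choice([0, 1, 2, 3], p=[0.76, 0.15, 0.05, 0.04])
        hazard = 0.02 * np.exp(0.5 * missed)       # more missed doses -> higher risk
        rebound = int(rng.random() < hazard)
        rows.append({"id": pid, "start": t, "stop": t + 28,
                     "missed_doses": missed, "rebound": rebound})
        t += 28
        if rebound:
            break

df = pd.DataFrame(rows)

ctv = CoxTimeVaryingFitter()
ctv.fit(df, id_col="id", start_col="start", stop_col="stop", event_col="rebound")
ctv.print_summary()   # hazard ratio per additional missed dose in this toy model
```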
Abstract:
OBJECTIVE: To assess the long-term effect of HAART on non-Hodgkin lymphoma (NHL) incidence in people with HIV (PHIV). DESIGN: Follow-up of the Swiss HIV Cohort Study (SHCS). METHODS: Between 1984 and 2006, 12,959 PHIV contributed a total of 75,222 person-years (py), of which 36,787 were spent under HAART. Among these PHIV, 429 NHL cases were identified from the SHCS dataset and/or by record linkage with Swiss Cantonal Cancer Registries. Age- and gender-standardized incidence was calculated and Cox regression was used to estimate hazard ratios (HR). RESULTS: NHL incidence reached 13.6 per 1000 py in 1993-1995 and declined to 1.8 in 2002-2006. HAART use was associated with a decline in NHL incidence [HR = 0.26; 95% confidence interval (CI), 0.20-0.33], and this decline was greater for primary brain lymphomas than for other NHL. Among non-HAART users, being a man having sex with men, being 35 years of age or older, or, most notably, having a low CD4 cell count at study enrollment (HR = 12.26 for <50 versus ≥350 cells/μl; 95% CI, 8.31-18.07) were significant predictors of NHL onset. Among HAART users, only age was significantly associated with NHL risk. The HR for NHL declined steeply in the first months after HAART initiation (HR = 0.46; 95% CI, 0.27-0.77) and was 0.12 (95% CI, 0.05-0.25) 7 to 10 years afterwards. CONCLUSIONS: HAART greatly reduced the incidence of NHL in PHIV and attenuated the influence of CD4 cell count on NHL risk. The beneficial effect remained strong up to 10 years after HAART initiation.
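Incidence here was age- and gender-standardized. The sketch below shows direct standardization in its simplest form; the stratum-specific events, person-years and standard-population weights are invented for illustration, and the actual strata and standard population used in the study are not stated in the abstract.

```python
import pandas as pd

# Directly standardized incidence rate: weight each stratum-specific rate by the
# share of that stratum in a chosen standard population.  Numbers are illustrative.
strata = pd.DataFrame({
    "stratum":      ["M <35", "M >=35", "F <35", "F >=35"],
    "events":       [12, 30, 4, 8],
    "person_years": [9000, 14000, 5000, 6000],
    "std_weight":   [0.30, 0.35, 0.20, 0.15],   # must sum to 1
})

strata["rate_per_1000py"] = 1000 * strata["events"] / strata["person_years"]
standardized_rate = (strata["rate_per_1000py"] * strata["std_weight"]).sum()
print(strata)
print(f"Directly standardized incidence: {standardized_rate:.2f} per 1000 py")
```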
Abstract:
OBJECTIVE: To investigate HIV-related immunodeficiency as a risk factor for hepatocellular carcinoma (HCC) among persons infected with HIV, while controlling for the effect of frequent coinfection with hepatitis C and B viruses. DESIGN: A case-control study nested in the Swiss HIV Cohort Study. METHODS: Twenty-six HCC patients were identified in the Swiss HIV Cohort Study or through linkage with Swiss Cancer Registries, and were individually matched to 251 controls according to Swiss HIV Cohort Study centre, sex, HIV-transmission category, age and year at enrollment. Odds ratios and corresponding confidence intervals were estimated by conditional logistic regression. RESULTS: All HCC patients were positive for hepatitis B surface antigen or antibodies against hepatitis C virus. HCC patients included 14 injection drug users (three positive for hepatitis B surface antigen and 13 for antibodies against hepatitis C virus) and 12 men having sex with men/heterosexual/other (11 positive for hepatitis B surface antigen, three for antibodies against hepatitis C virus), revealing a strong relationship between HIV-transmission route and hepatitis virus type. Latest CD4+ cell count [odds ratio (OR) per 100 cells/μl decrease = 1.33, 95% confidence interval (CI) 1.06-1.68] and CD4+ cell count percentage (OR per 10% decrease = 1.65, 95% CI 1.01-2.71) were significantly associated with HCC. The effects of CD4+ cell count were concentrated among men having sex with men/heterosexual/other rather than injection drug users. Highly active antiretroviral therapy use was not significantly associated with HCC risk (OR for ever versus never use = 0.59, 95% CI 0.18-1.91). CONCLUSION: Lower CD4+ cell counts increased the risk of HCC among persons infected with HIV, an effect that was particularly evident for hepatitis B virus-related HCC arising in non-injection drug users.
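Odds ratios in this matched design were estimated by conditional logistic regression, where each matched case-control set forms its own stratum. The sketch below assumes statsmodels' ConditionalLogit for that purpose; the variable names and data are synthetic and the model is reduced to a single covariate (CD4+ count per 100 cells/μl), so it only illustrates the shape of the analysis.

```python
import numpy as np
import pandas as pd
from statsmodels.discrete.conditional_models import ConditionalLogit

rng = np.random.default_rng(3)

# Simulated matched case-control data: each matched set has 1 HCC case and
# several controls; 'cd4_per100' is the latest CD4+ count in units of 100 cells/ul.
rows = []
for match_set in range(26):
    rows.append({"set": match_set, "case": 1,
                 "cd4_per100": rng.normal(2.5, 1.0)})     # cases: lower CD4 in this toy data
    for _ in range(int(rng.integers(5, 12))):
        rows.append({"set": match_set, "case": 0,
                     "cd4_per100": rng.normal(3.5, 1.2)})

df = pd.DataFrame(rows)

model = ConditionalLogit(df["case"], df[["cd4_per100"]], groups=df["set"])
result = model.fit()
print(result.summary())

# Odds ratio per 100-cell/ul *decrease* corresponds to exp(-coefficient).
print("OR per 100 cells/ul decrease:", float(np.exp(-np.asarray(result.params)[0])))
```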
Abstract:
Context: In the Health Outcomes and Reduced Incidence with Zoledronic Acid Once Yearly - Pivotal Fracture Trial (HORIZON-PFT), zoledronic acid (ZOL) 5 mg significantly reduced fracture risk. Objective: To identify factors associated with greater efficacy during ZOL 5 mg treatment. Design, Setting and Patients: Subgroup analysis (preplanned and post hoc) of a multicenter, double-blind, placebo-controlled, 36-month trial in 7765 women with postmenopausal osteoporosis. Intervention: Single infusion of ZOL 5 mg or placebo at baseline, 12 and 24 months. Main Outcome Measures: Primary endpoints: new vertebral fracture and hip fracture. Secondary endpoints: non-vertebral fracture and change in femoral neck bone mineral density (BMD). Baseline risk factor subgroups: age, BMD T-score and vertebral fracture status, total hip BMD, race, weight, geographical region, smoking, height loss, history of falls, physical activity, prior bisphosphonates, creatinine clearance, body mass index (BMI), and concomitant osteoporosis medications. Results: ZOL-induced effects on vertebral fracture risk were greater with younger age (treatment-by-subgroup interaction P=0.05), normal creatinine clearance (P=0.04), and BMI ≥25 kg/m² (P=0.02). There were no significant treatment-by-factor interactions for hip or non-vertebral fracture or for change in BMD. Conclusions: ZOL appeared more effective in preventing vertebral fracture in younger women, overweight/obese women and women with normal renal function. ZOL had similar effects irrespective of fracture risk factors or femoral neck BMD.
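The subgroup findings rest on treatment-by-subgroup interaction tests. The sketch below shows one such test (treatment x BMI category) as an interaction term in a logistic model with statsmodels; the data and variable names are invented, and the trial's own prespecified models were not necessarily of this form.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(4)

# Synthetic trial-like data: 'zol' = 1 for zoledronic acid, 0 for placebo;
# 'high_bmi' = 1 for BMI >= 25 kg/m2; 'fracture' = new vertebral fracture.
n = 4000
zol = rng.integers(0, 2, n)
high_bmi = rng.integers(0, 2, n)
# Build in a larger treatment effect in the high-BMI stratum for illustration.
logit = -2.0 - 0.6 * zol - 0.5 * zol * high_bmi + 0.1 * high_bmi
fracture = rng.binomial(1, 1 / (1 + np.exp(-logit)))

df = pd.DataFrame({"zol": zol, "high_bmi": high_bmi, "fracture": fracture})

# The 'zol:high_bmi' coefficient is the treatment-by-subgroup interaction;
# its p-value plays the role of the interaction P values quoted in the abstract.
fit = smf.logit("fracture ~ zol * high_bmi", data=df).fit()
print(fit.summary())
```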
Abstract:
OBJECTIVE: We examined survival and prognostic factors of patients who developed HIV-associated non-Hodgkin lymphoma (NHL) in the era of combination antiretroviral therapy (cART). DESIGN AND SETTING: Multicohort collaboration of 33 European cohorts. METHODS: We included all cART-naive patients enrolled in cohorts participating in the Collaboration of Observational HIV Epidemiological Research Europe (COHERE) who were aged 16 years or older, started cART at some point after 1 January 1998 and developed NHL after 1 January 1998. Patients had to have a CD4 cell count measured after 1 January 1998 and one at diagnosis of the NHL. Survival and prognostic factors were estimated using Weibull models, with random effects accounting for heterogeneity between cohorts. RESULTS: Of 67,659 patients who were followed up during 304,940 person-years, 1176 were diagnosed with NHL. Eight hundred and forty-seven patients (72%) from 22 cohorts met the inclusion criteria. Survival at 1 year was 66% [95% confidence interval (CI) 63-70%] for systemic NHL (n = 763) and 54% (95% CI 43-65%) for primary brain lymphoma (n = 84). Risk factors for death included low nadir CD4 cell counts and a history of injection drug use. Patients developing NHL on cART had an increased risk of death compared with patients who were cART-naive at diagnosis. CONCLUSION: In the era of cART, two-thirds of patients diagnosed with HIV-related systemic NHL survive for longer than 1 year after diagnosis. Survival is poorer in patients diagnosed with primary brain lymphoma. More advanced immunodeficiency is the dominant prognostic factor for mortality in patients with HIV-related NHL.
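Survival after NHL diagnosis was modelled with Weibull models including cohort-level random effects. The sketch below fits a plain Weibull survival regression with lifelines; the between-cohort random effect is deliberately omitted (a simplification), the covariates are limited to nadir CD4 count and injection drug use, and the data are synthetic rather than the COHERE dataset.

```python
import numpy as np
import pandas as pd
from lifelines import WeibullAFTFitter

rng = np.random.default_rng(5)

# Synthetic patient-level data: time from NHL diagnosis to death/censoring (months),
# nadir CD4 count (cells/ul) and injection drug use history.
n = 800
nadir_cd4 = rng.gamma(shape=2.0, scale=100, size=n)
idu = rng.integers(0, 2, n)
# Shorter survival with lower nadir CD4 and with IDU in this toy model.
scale = np.exp(2.5 + 0.002 * nadir_cd4 - 0.4 * idu)
time = rng.weibull(1.2, size=n) * scale
observed = rng.random(n) < 0.7                 # ~70% deaths observed, rest censored
time = np.where(observed, time, time * rng.uniform(0.1, 1.0, n))
time = np.maximum(time, 0.01)                  # guard: durations must be positive

df = pd.DataFrame({"time": time, "death": observed.astype(int),
                   "nadir_cd4": nadir_cd4, "idu": idu})

aft = WeibullAFTFitter()
aft.fit(df, duration_col="time", event_col="death")
aft.print_summary()   # acceleration factors for nadir CD4 and IDU in this sketch
```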
Abstract:
OBJECTIVE: To analyse risk factors in alpine skiing. DESIGN: A controlled multicentre survey of injured and non-injured alpine skiers. SETTING: One tertiary and two secondary trauma centres in Bern, Switzerland. PATIENTS AND METHODS: All injured skiers admitted from November 2007 to April 2008 were analysed using a completed questionnaire incorporating 15 parameters. The same questionnaire was distributed to non-injured controls. Multiple logistic regression was performed, and patterns of combined risk factors were calculated by inference trees. A total of 782 patients and 496 controls were interviewed. RESULTS: Parameters that were significant for the patients were: high readiness for risk (p = 0.0365, OR 1.84, 95% CI 1.04 to 3.27); low readiness for speed (p = 0.0008, OR 0.29, 95% CI 0.14 to 0.60); no aggressive behaviour on slopes (p < 0.0001, OR 0.19, 95% CI 0.09 to 0.37); new skiing equipment (p = 0.0228, OR 0.59, 95% CI 0.37 to 0.93); warm-up performed (p = 0.0015, OR 1.79, 95% CI 1.25 to 2.57); old snow compared with fresh snow (p = 0.0155, OR 0.31, 95% CI 0.12 to 0.80); old snow compared with artificial snow (p = 0.0037, OR 0.21, 95% CI 0.07 to 0.60); powder snow compared with slushy snow (p = 0.0035, OR 0.25, 95% CI 0.10 to 0.63); drug consumption (p = 0.0044, OR 5.92, 95% CI 1.74 to 20.11); and alcohol abstinence (p < 0.0001, OR 0.14, 95% CI 0.05 to 0.34). Three groups at risk were detected: (1) warm-up 3-12 min, visual analogue scale speed score (VAS speed) >4 and bad weather/visibility; (2) VAS speed 4-7, icy slopes and not wearing a helmet; (3) warm-up >12 min and new skiing equipment. CONCLUSIONS: Low speed, high readiness for risk, new skiing equipment, old and powder snow, and drug consumption are significant risk factors when skiing. Future work should aim to identify specific groups at risk more precisely and to develop recommendations, for example a snow weather index at valley stations.
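Risk factors here were assessed in two steps: multiple logistic regression for individual factors, and inference trees for combined risk patterns. The sketch below mirrors that two-step idea in Python, using a logistic model plus an ordinary decision tree as a rough stand-in for the conditional inference trees used in the study; all variables and data are invented for illustration.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf
from sklearn.tree import DecisionTreeClassifier, export_text

rng = np.random.default_rng(6)

# Synthetic skier data: 'injured' = case vs control; covariates mirror some of the
# questionnaire items (risk readiness, speed rating, new equipment, warm-up minutes).
n = 1200
df = pd.DataFrame({
    "risk_readiness": rng.integers(0, 2, n),
    "vas_speed":      rng.integers(0, 11, n),
    "new_equipment":  rng.integers(0, 2, n),
    "warmup_min":     rng.integers(0, 30, n),
})
logit = -1.0 + 0.6 * df.risk_readiness - 0.1 * df.vas_speed + 0.3 * df.new_equipment
df["injured"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

# Step 1: multiple logistic regression -> odds ratios for individual risk factors.
fit = smf.logit("injured ~ risk_readiness + vas_speed + new_equipment + warmup_min",
                data=df).fit()
print(np.exp(fit.params))          # odds ratios

# Step 2: a shallow tree to surface combined risk patterns (analogous in spirit,
# though not identical, to inference trees).
features = ["risk_readiness", "vas_speed", "new_equipment", "warmup_min"]
tree = DecisionTreeClassifier(max_depth=3, min_samples_leaf=50, random_state=0)
tree.fit(df[features], df["injured"])
print(export_text(tree, feature_names=features))
```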
Abstract:
Cognitive-perceptive 'basic symptoms' are used as a complement to ultra-high-risk criteria in order to predict the onset of psychosis in the pre-psychotic phase. The aim was to investigate the prevalence of a broad selection of 'basic symptoms' in a representative general adolescent population sample (GPS; N=96) and to compare it with that in adolescents first admitted for early-onset psychosis (EOP; N=87) or non-psychotic psychiatric disorders (NP; N=137).
Abstract:
Background Left atrial (LA) dilation and P-wave duration are linked to the amount of endurance training and are risk factors for atrial fibrillation (AF). The aim of this study was to evaluate the impact of LA anatomical and electrical remodeling on its conduit and pump function measured by two-dimensional speckle tracking echocardiography (STE). Method Amateur male runners > 30 years of age were recruited. Study participants (n = 95) were stratified into 3 groups according to lifetime training hours: low (< 1500 h, n = 33), intermediate (1500 to 4500 h, n = 32) and high training group (> 4500 h, n = 30). Results No differences were found between the groups in terms of age, blood pressure, and diastolic function. LA maximal volume index (30 ± 5, 33 ± 5 vs. 37 ± 6 ml/m2, p < 0.001) and conduit volume index (9 ± 3, 11 ± 3 vs. 12 ± 3 ml/m2, p < 0.001) increased significantly from the low to the high training group, unlike the STE parameters: pump strain − 15.0 ± 2.8, − 14.7 ± 2.7 vs. − 14.9 ± 2.6%, p = 0.927; conduit strain 23.3 ± 3.9, 22.1 ± 5.3 vs. 23.7 ± 5.7%, p = 0.455. Independent predictors of LA conduit strain were age, maximal early diastolic velocity of the mitral annulus, heart rate and peak early diastolic filling velocity. The signal-averaged P-wave duration (135 ± 11, 139 ± 10 vs. 148 ± 14 ms, p < 0.001) increased from the low to the high training group. Four episodes of non-sustained AF were recorded in one runner of the high training group. Conclusion The LA anatomical and electrical remodeling does not have a negative impact on atrial mechanical function. Hence, a possible link between these risk factors for AF and its actual, rare occurrence in this athlete population could not be uncovered in the present study.
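The comparisons across the low, intermediate and high lifetime-training groups (for LA volumes, strain and P-wave duration) are classic three-group comparisons. The sketch below runs a one-way ANOVA with scipy; the abstract does not name the test used, so ANOVA is an assumption, and the values are simulated around the reported group means rather than taken from the echo data.

```python
import numpy as np
from scipy.stats import f_oneway

rng = np.random.default_rng(7)

# Synthetic LA maximal volume index (ml/m2) for the three training groups,
# centred on the group means reported in the abstract.
low  = rng.normal(30, 5, 33)
mid  = rng.normal(33, 5, 32)
high = rng.normal(37, 6, 30)

# One-way ANOVA across the three training groups.
stat, p_value = f_oneway(low, mid, high)
print(f"F = {stat:.2f}, p = {p_value:.4g}")
```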
Abstract:
OBJECTIVES To identify potential prognostic factors affecting outcome in septic peritonitis caused by gastrointestinal perforation in dogs and cats. METHODS A retrospective study. Animals operated on for septic peritonitis caused by gastrointestinal perforation were evaluated. Risk factors assessed included age, duration of clinical signs, recent prior abdominal surgery, recent prior anti-inflammatory drug administration, placement of a closed-suction drain and location of perforation. RESULTS Fifty-five animals (44 dogs and 11 cats) were included. The overall mortality was 63·6%. No association was found between age, duration of clinical signs or prior abdominal surgery and outcome. Animals with a history of prior anti-inflammatory drug administration were significantly (P=0·0011) more likely to have perforation of the pylorus (73·3%). No significant difference in outcome was found between animals treated with closed-suction drains and those treated with primary closure, or between pyloric perforation and perforation at other gastrointestinal sites. CLINICAL SIGNIFICANCE Administration of anti-inflammatory drugs in dogs and cats is a significant risk factor for pyloric perforation. Pyloric perforation was not associated with a poorer outcome than perforation at other gastrointestinal sites. Placement of a closed-suction drain did not improve outcome compared with primary closure.
Abstract:
BACKGROUND: Outcome data are limited in patients with ST-segment elevation acute myocardial infarction (STEMI) or other acute coronary syndromes (ACS) who receive a drug-eluting stent (DES). Data suggest that first-generation DES are associated with an increased risk of stent thrombosis when used in STEMI. Whether this observation persists with newer-generation DES is unknown. The study objective was to analyze the two-year safety and effectiveness of Resolute™ zotarolimus-eluting stents (R-ZES) implanted for STEMI, ACS without ST-segment elevation (non-STEACS), and stable angina (SA). METHODS: Data from the Resolute program (Resolute All Comers and Resolute International) were pooled and patients with R-ZES implantation were categorized by indication: STEMI (n=335), non-STEACS (n=1416), and SA (n=1260). RESULTS: Mean age was 59.8±11.3 years (STEMI), 63.8±11.6 (non-STEACS), and 64.9±10.1 (SA). Fewer STEMI patients had diabetes (19.1% vs. 28.5% vs. 29.2%; P<0.001), prior MI (11.3% vs. 27.2% vs. 29.4%; P<0.001), or previous revascularization (11.3% vs. 27.9% vs. 37.6%; P<0.001). Two-year definite/probable stent thrombosis occurred in 2.4% (STEMI), 1.2% (non-STEACS) and 1.1% (SA) of patients, with late/very late stent thrombosis (days 31-720) rates of 0.6% (STEMI and non-STEACS) and 0.4% (SA) (P=NS). The two-year mortality rate was 2.1% (STEMI), 4.8% (non-STEACS) and 3.7% (SA) (P=NS). Death or target vessel re-infarction occurred in 3.9% (STEMI), 8.7% (non-STEACS) and 7.3% (SA) (P=0.012). CONCLUSION: R-ZES in STEMI and in other clinical presentations is effective and safe. Long-term outcomes are favorable, with a very low incidence of late and very late stent thrombosis following R-ZES implantation across indications.