27 results for 32-310
in BORIS: Bern Open Repository and Information System - Bern - Switzerland
Abstract:
Aim: Intrauterine conditions may interfere with fetal brain development. We compared the neurodevelopmental outcome of infants born at <32 weeks gestational age after maternal preeclampsia or chorioamnionitis with that of controls. Methods: Case-control study of infants with maternal preeclampsia, infants with maternal chorioamnionitis and controls (each n = 33), matched for gestational age. Neurodevelopment at two years was assessed with the Bayley Scales of Infant Development II. Results: Ninety-nine infants were included with a median gestational age of 29 weeks (range 25-32). The median mental developmental index (MDI) was 96 in the control group, 90 in the chorioamnionitis group and 86 in the preeclampsia group. Preeclampsia-exposed infants had a lower MDI than the control group (univariate p = 0.021, multivariate p = 0.183) and than the chorioamnionitis group (univariate p = 0.242; multivariate p = 0.027). The median psychomotor developmental index (PDI) was 80.5 in the control, 80 in the preeclampsia and 85 in the chorioamnionitis group and did not differ between the three groups (p > 0.05). Chorioamnionitis or preeclampsia exposure was not associated with major neurodevelopmental impairment (cerebral palsy, MDI < 70, PDI < 70). Conclusion: The results of this preliminary study suggest that preeclampsia and chorioamnionitis play a relatively minor role among risk factors for adverse neurodevelopmental outcome. Postnatal factors such as ventilation and bronchopulmonary dysplasia may have a greater impact on neurodevelopmental outcome.
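The abstract does not state which univariate test was applied to the group comparison of MDI scores; the sketch below shows one plausible way such a comparison could be run (Kruskal-Wallis across the three matched groups, pairwise Mann-Whitney U for the preeclampsia versus control contrast). All scores are invented for illustration and are not data from the study.

```python
# Hypothetical sketch: univariate comparison of Bayley MDI scores across
# three matched groups (control, chorioamnionitis, preeclampsia).
# Scores are simulated; the original study's exact test is not specified.
import numpy as np
from scipy.stats import kruskal, mannwhitneyu

rng = np.random.default_rng(0)
control = rng.normal(96, 12, 33).round()   # median MDI ~96 (hypothetical spread)
chorio  = rng.normal(90, 12, 33).round()   # median MDI ~90
preecl  = rng.normal(86, 12, 33).round()   # median MDI ~86

# Global comparison across the three groups
h_stat, p_global = kruskal(control, chorio, preecl)

# Pairwise contrast corresponding to "preeclampsia vs. control"
u_stat, p_pair = mannwhitneyu(preecl, control, alternative="two-sided")

print(f"Kruskal-Wallis p = {p_global:.3f}")
print(f"Preeclampsia vs. control (Mann-Whitney U) p = {p_pair:.3f}")
```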
Abstract:
Background: Several cross-sectional studies over the past 10 years have observed an increased risk of allergic outcomes for children living in damp or mouldy environments. Objective: The objective of this study was to investigate whether reported mould or dampness exposure in early life is associated with the development of allergic disorders in children from eight European birth cohorts. Methods: We analysed data on 31 742 children from eight ongoing European birth cohorts. Exposure to mould and allergic health outcomes were assessed by parental questionnaires at different time points. Meta-analyses with fixed- and random-effect models were applied. The number of studies included in each analysis varied according to the outcome data available for each cohort. Results: Exposure to visible mould and/or dampness during the first 2 years of life was associated with an increased risk of developing asthma: there was a significant association with early asthma symptoms in meta-analyses of four cohorts [0–2 years: adjusted odds ratio (aOR), 1.39 (95% CI, 1.05–1.84)] and with asthma later in childhood in six cohorts [6–8 years: aOR, 1.09 (95% CI, 0.90–1.32); 3–10 years: aOR, 1.10 (95% CI, 0.90–1.34)]. A statistically significant association was observed in six cohorts with symptoms of allergic rhinitis at school age [6–8 years: aOR, 1.12 (1.02–1.23)] and at any time point between 3 and 10 years [aOR, 1.18 (1.09–1.28)]. Conclusion: These findings suggest that a mouldy home environment in early life is associated with an increased risk of asthma, particularly in young children, and of allergic rhinitis symptoms in school-age children.
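As a rough illustration of the fixed- and random-effect pooling mentioned in the methods, the sketch below combines per-cohort adjusted odds ratios by inverse-variance weighting of log-ORs, with a DerSimonian-Laird estimate of between-cohort heterogeneity. The cohort-level ORs and confidence intervals are hypothetical placeholders, not the published cohort estimates.

```python
# Minimal inverse-variance meta-analysis sketch (fixed effect and
# DerSimonian-Laird random effects) on hypothetical per-cohort aORs.
import numpy as np

# (aOR, lower 95% CI, upper 95% CI) per cohort -- illustrative values only
cohorts = [(1.45, 0.95, 2.20), (1.20, 0.80, 1.80),
           (1.60, 1.00, 2.56), (1.10, 0.70, 1.73)]

y = np.log([o for o, lo, hi in cohorts])                       # log odds ratios
se = np.array([(np.log(hi) - np.log(lo)) / (2 * 1.96) for o, lo, hi in cohorts])
v = se ** 2

# Fixed-effect pooling: weights are inverse variances
w = 1.0 / v
theta_fe = np.sum(w * y) / np.sum(w)
se_fe = np.sqrt(1.0 / np.sum(w))

# DerSimonian-Laird between-cohort variance tau^2
k = len(y)
Q = np.sum(w * (y - theta_fe) ** 2)
tau2 = max(0.0, (Q - (k - 1)) / (np.sum(w) - np.sum(w ** 2) / np.sum(w)))

# Random-effects pooling: weights incorporate tau^2
w_re = 1.0 / (v + tau2)
theta_re = np.sum(w_re * y) / np.sum(w_re)
se_re = np.sqrt(1.0 / np.sum(w_re))

for name, th, s in [("fixed", theta_fe, se_fe), ("random", theta_re, se_re)]:
    print(f"{name}: pooled aOR = {np.exp(th):.2f} "
          f"(95% CI {np.exp(th - 1.96 * s):.2f}-{np.exp(th + 1.96 * s):.2f})")
```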
Abstract:
Correction of prominent ears is a common plastic surgical procedure. We introduced a new non-invasive laser-assisted cartilage reshaping (LACR) technique as an alternative to invasive surgical otoplasty.
Abstract:
BACKGROUND: Plasminogen activator inhibitor type-1 (PAI-1) is considered to be the main inhibitor of fibrinolysis in sepsis. However, the contribution of thrombin-activatable fibrinolysis inhibitor (TAFI) to the inhibition of fibrinolysis in sepsis is currently unknown. METHODS: TAFI antigen and PAI-1 levels were measured in severe sepsis (n = 32) and septic shock (n = 8) patients. In addition, TAFI antigen levels had been determined in 151 controls. RESULTS: Septic patients had significantly (p < 0.0001) decreased TAFI levels (median: 78.9% [range: 32.4-172.6]) compared with controls (108.1% [35.9-255.4]). TAFI levels were equal in septic shock and severe sepsis (68.9% [32.4-172.6] vs. 82.5% [32.7-144.9], p = 0.987) as well as in survivors and non-survivors (87.1% [32.7-172.6] vs. 65.8% [32.4-129.5], p = 0.166). PAI-1 levels were significantly higher in septic shock (705.5 ng/ml [131-5788]) than in severe sepsis patients (316.5 ng/ml [53-1311], p = 0.016) and were equal in survivors and non-survivors (342 ng/ml [53-1311] vs. 413 ng/ml [55-5788], p = 0.231). The TAT/PAP ratio (R(TAT/PAP)), reflecting the imbalance between coagulation and fibrinolysis, was calculated. R(TAT/PAP) increased significantly with fatality and was significantly dependent on PAI-1, but not on TAFI. PAI-1 levels (570.5 ng/ml [135-5788]) and R(TAT/PAP) (1.6 [0.3-6.1]) were significantly (p = 0.008 and p = 0.047) higher in patients with overt DIC than in patients without overt DIC (310 ng/ml [53-1128] and 0.6 [0.1-4.3]), whereas no difference was found for TAFI levels (68.9% [32.7-133.2] vs. 86.4% [32.4-172.6], p = 0.325). CONCLUSIONS: Although fibrinolysis inhibition in sepsis is mediated by both PAI-1 and TAFI, PAI-1 might be involved early in the septic process, whereas TAFI might be responsible for ongoing inhibition of fibrinolysis in later stages of sepsis.
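R(TAT/PAP) is simply the quotient of two activation-marker concentrations; the tiny sketch below shows the calculation with invented values, assuming both markers are expressed in comparable units so the ratio is interpretable as coagulation versus fibrinolysis activation. The numbers are illustrative only.

```python
# Hypothetical R(TAT/PAP): ratio of thrombin-antithrombin complex (TAT,
# coagulation activation) to plasmin-antiplasmin complex (PAP, fibrinolysis
# activation). Values are invented and assume comparable units for both markers.
def tat_pap_ratio(tat: float, pap: float) -> float:
    return tat / pap

print(tat_pap_ratio(tat=12.0, pap=7.5))  # 1.6 -> coagulation outweighs fibrinolysis
print(tat_pap_ratio(tat=3.0, pap=5.0))   # 0.6 -> relatively preserved fibrinolysis
```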
Abstract:
AIMS: Lesion length remains a predictor of target lesion revascularisation, and the results of long lesion stenting remain poor. Sirolimus-eluting stents have been shown to perform better than paclitaxel-eluting stents in long lesions. In this substudy of the LEADERS trial, we compared the performance of biolimus-eluting stents with a biodegradable polymer (BES) and sirolimus-eluting stents with a permanent polymer (SES) in long lesions. METHODS AND RESULTS: A total of 1,707 'all-comer' patients were randomly allocated to treatment with BES or SES. A stratified analysis of angiographic and clinical outcomes at nine months and one year, respectively, was performed for vessels with lesion length <20 mm versus >20 mm (as measured by quantitative angiography). Of the 1,707 patients, 592 BES patients with 831 lesions and 619 SES patients with 876 lesions had only short lesions treated. One hundred and fifty-three BES patients with 166 lesions and 151 SES patients with 162 lesions had long lesions. There were no significant differences in baseline clinical characteristics, except for a higher number of patients with long lesions presenting with acute myocardial infarction in both stent groups. Long lesions tended to have a lower minimal lumen diameter (MLD) and a greater percent diameter stenosis at baseline than short lesions. Late loss was greater for long lesions than for short lesions. There was no statistically significant difference in late loss between BES and SES (0.32±0.69 vs 0.24±0.57, p=0.59). Binary in-segment restenosis was present in 23.2% versus 13.1% of long lesions treated with BES and SES, respectively (p=0.042). In patients with long lesions, the overall MACE rate was similar for BES and SES (17% vs 14.6%; p=0.62). There was a trend towards a higher overall TLR rate with BES (12.4% vs 6.0%; HR=2.06; p=0.07) and a higher clinically driven TLR rate (10.5% vs 5.3%; HR=1.94; p=0.13). Rates of definite stent thrombosis were 3.3% in the long lesion group and 1.3-1.7% in the short lesion group. CONCLUSIONS: BES and SES appear similar with respect to MACE in long lesions in this 'all-comer' patient population. However, long lesions tended to have a higher rate of binary in-segment restenosis and TLR following BES than SES treatment.
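Late loss and binary in-segment restenosis are standard quantitative coronary angiography (QCA) quantities; the sketch below computes them under the usual definitions (late loss = post-procedural MLD minus follow-up MLD; binary restenosis = diameter stenosis of at least 50% at follow-up). The class, field names and measurements are illustrative, not LEADERS data.

```python
# Standard QCA definitions applied to hypothetical measurements (mm).
from dataclasses import dataclass

@dataclass
class QcaLesion:
    ref_diameter_fu: float   # reference vessel diameter at follow-up
    mld_post: float          # minimal lumen diameter right after stenting
    mld_fu: float            # minimal lumen diameter at angiographic follow-up

    @property
    def late_loss(self) -> float:
        # Late lumen loss: MLD immediately after the procedure minus MLD at follow-up
        return self.mld_post - self.mld_fu

    @property
    def percent_diameter_stenosis_fu(self) -> float:
        return 100.0 * (1.0 - self.mld_fu / self.ref_diameter_fu)

    @property
    def binary_restenosis(self) -> bool:
        # Conventional cut-off: >=50% diameter stenosis within the analysed segment
        return self.percent_diameter_stenosis_fu >= 50.0

lesion = QcaLesion(ref_diameter_fu=3.0, mld_post=2.6, mld_fu=1.4)
print(f"late loss = {lesion.late_loss:.2f} mm")                          # 1.20 mm
print(f"%DS at follow-up = {lesion.percent_diameter_stenosis_fu:.0f}%")  # ~53%
print(f"binary restenosis = {lesion.binary_restenosis}")                 # True
```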
Abstract:
BACKGROUND: The purpose of the study was to investigate allogeneic blood transfusion (ABT) and preoperative anemia as risk factors for surgical site infection (SSI). STUDY DESIGN AND METHODS: A prospective, observational cohort of 5873 consecutive general surgical procedures at Basel University Hospital was analyzed to determine the relationship between perioperative ABT and preoperative anemia and the incidence of SSI. ABT was defined as transfusion of leukoreduced red blood cells during surgery, and anemia as a hemoglobin concentration of less than 120 g/L before surgery. Surgical wounds and resulting infections were assessed according to Centers for Disease Control standards. RESULTS: The overall SSI rate was 4.8% (284 of 5873). In univariable logistic regression analyses, perioperative ABT (crude odds ratio [OR], 2.93; 95% confidence interval [CI], 2.1 to 4.0; p < 0.001) and preoperative anemia (crude OR, 1.32; 95% CI, 1.0 to 1.7; p = 0.037) were significantly associated with increased odds of SSI. After adjusting for 13 characteristics of the patient and the procedure in multivariable analyses, the associations were substantially attenuated for ABT (OR, 1.25; 95% CI, 0.8 to 1.9; p = 0.310 for 1-2 blood units; OR, 1.07; 95% CI, 0.6 to 2.0; p = 0.817 for ≥3 blood units) and for anemia (OR, 0.91; 95% CI, 0.7 to 1.2; p = 0.530). Duration of surgery was the main confounding variable. CONCLUSION: Our findings point to important confounding factors and strengthen existing doubts about leukoreduced ABT during general surgery and preoperative anemia as risk factors for SSI.
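To make the crude-versus-adjusted distinction concrete, the sketch below computes a crude OR from a 2×2 table (Woolf confidence interval) and an adjusted OR from a logistic model that includes one simulated confounder (long operation time). The data are entirely simulated and the model has a single covariate, not the study's 13-covariate adjustment; it only illustrates how confounding by surgery duration can attenuate a crude association.

```python
# Crude vs. confounder-adjusted odds ratio on simulated data.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 4000

# Simulated confounding: long operations make both transfusion and SSI more likely.
long_surgery = rng.binomial(1, 0.3, n)
transfused = rng.binomial(1, 0.05 + 0.25 * long_surgery)
p_ssi = 1 / (1 + np.exp(-(-3.5 + 1.2 * long_surgery + 0.1 * transfused)))
ssi = rng.binomial(1, p_ssi)

# Crude OR from the 2x2 table with a Woolf (log-scale) confidence interval
a = np.sum((transfused == 1) & (ssi == 1)); b = np.sum((transfused == 1) & (ssi == 0))
c = np.sum((transfused == 0) & (ssi == 1)); d = np.sum((transfused == 0) & (ssi == 0))
crude_or = (a * d) / (b * c)
se_log = np.sqrt(1/a + 1/b + 1/c + 1/d)
ci = np.exp(np.log(crude_or) + np.array([-1.96, 1.96]) * se_log)
print(f"crude OR = {crude_or:.2f} (95% CI {ci[0]:.2f}-{ci[1]:.2f})")

# Adjusted OR: logistic regression including the confounder
X = sm.add_constant(np.column_stack([transfused, long_surgery]))
fit = sm.Logit(ssi, X).fit(disp=0)
print(f"adjusted OR for transfusion = {np.exp(fit.params[1]):.2f}")
```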
Abstract:
Daily administration of 2-chlorodeoxyadenosine (cladribine, CDA) is a standard treatment for hairy cell leukemia, but may cause severe neutropenia and neutropenic fever. This trial compared the toxicity and efficacy of weekly versus daily CDA administration. One hundred patients were randomized to receive standard treatment (CDA 0.14 mg/kg/day on days 1-5 [Arm A]) or experimental treatment (CDA 0.14 mg/kg/day once weekly for 5 weeks [Arm B]). The primary endpoint was the average leukocyte count within 6 weeks of randomization. Secondary endpoints included response rates, other acute hematotoxicity, acute infection rate, hospital admission, remission duration, event-free survival, and overall survival. There was no significant difference in the average leukocyte count. The response rate (complete + partial remission) at week 10 was 78% (95% confidence interval (CI) 64-88%) in Arm A and 68% (95% CI 54-80%) in Arm B (p = 0.13). Best response rates during follow-up were identical (86%) in both arms. No significant difference was found in the rates of grade 3-4 leukocytopenia (94% vs. 84%), grade 3-4 neutropenia (90% vs. 80%), acute infection (44% vs. 40%), hospitalization (38% vs. 34%), or erythrocyte support (22% vs. 30%) within 10 weeks. Overall, these findings indicate that there is no apparent advantage in toxicity or efficacy to giving CDA weekly rather than daily.
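The reported response rates carry exact-style binomial confidence intervals; as a sanity check under the assumption of roughly 50 patients per arm (the abstract does not give the per-arm denominators), the sketch below computes a Clopper-Pearson 95% CI for 39/50 responders (~78%), which lands near the quoted 64-88% range.

```python
# Clopper-Pearson (exact) 95% CI for a binomial proportion, e.g. a response rate.
from scipy.stats import beta

def clopper_pearson(successes: int, n: int, alpha: float = 0.05):
    lower = beta.ppf(alpha / 2, successes, n - successes + 1) if successes > 0 else 0.0
    upper = beta.ppf(1 - alpha / 2, successes + 1, n - successes) if successes < n else 1.0
    return lower, upper

# Assumed ~50 patients per arm; 39/50 corresponds to a 78% response rate.
lo, hi = clopper_pearson(39, 50)
print(f"response rate 39/50 = {39/50:.0%}, 95% CI {lo:.0%}-{hi:.0%}")
```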