17 results for 38-0.45 µm carbonate fraction

in BORIS: Bern Open Repository and Information System - Bern - Switzerland


Relevance: 100.00%

Abstract:

This study investigates the capability of spaceborne remote sensing data to predict ground concentrations of PM10 over the European Alpine region using satellite-derived Aerosol Optical Depth (AOD) from the geostationary Spinning Enhanced Visible and InfraRed Imager (SEVIRI) and the polar-orbiting MODerate resolution Imaging Spectroradiometer (MODIS). The spatial and temporal resolutions of these aerosol products (10 km and 2 measurements per day for MODIS, ∼25 km and observation intervals of 15 min for SEVIRI) permit an evaluation of PM estimation from space at different spatial and temporal scales. Different empirical linear relationships between coincident AOD and PM10 observations are evaluated at 13 ground-based PM measurement sites, under the assumption that aerosols are homogeneously distributed in the vertical below the planetary Boundary Layer Height (BLH). The variability of the BLH and of Relative Humidity (RH) is assessed, as well as their impact on the parameterization. The BLH has a strong influence on the correlation of daily and hourly time series, whilst RH effects are less clear and smaller in magnitude. Despite its lower spatial resolution and AOD accuracy, SEVIRI shows higher correlations than MODIS (rSEV ∼ 0.7, rMOD ∼ 0.6) with regard to daily averaged PM10. MODIS offers advantages only at hourly time scales in mountainous locations, although correlations for both sensors are lower at this time scale (r ∼ 0.45). Moreover, the fraction of days in 2008 with at least one satellite observation was 27% for SEVIRI and 17% for MODIS. These results suggest that the frequency of observations plays an important role in PM monitoring, while higher spatial resolution does not generally improve the PM estimation. Ground-based Sun Photometer (SP) measurements are used to validate the satellite-based AOD in the study region and to discuss the impact of aerosol micro-physical properties on the empirical models. 
A lower error limit of 30 to 60% in the PM10 assessment from space is estimated in the study area as a result of AOD uncertainties, variability of aerosols properties and the heterogeneity of ground measurement sites. It is concluded that SEVIRI has a similar capacity to map PM as sensors on board polar-orbiting platforms, with the advantage of a higher number of observations. However, the accuracy represents a serious limitation to the applicability of satellites for ground PM mapping, especially in mountainous areas.
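The empirical approach described above — a linear relationship between coincident AOD and PM10 observations, with the column AOD scaled by the boundary layer height under the well-mixed assumption — can be sketched as follows. All observation values below are hypothetical illustrations, not data from the study:

```python
import numpy as np

# Hypothetical coincident observations (the real study used 13 ground sites).
aod = np.array([0.12, 0.25, 0.31, 0.45, 0.60, 0.72])   # satellite AOD, unitless
blh = np.array([1.2, 0.9, 1.5, 1.1, 0.8, 1.0])          # boundary layer height, km
pm10 = np.array([14.0, 31.0, 24.0, 43.0, 78.0, 74.0])   # ground PM10, µg/m³

# Assume aerosols are well mixed below the BLH, so the column AOD divided
# by the BLH approximates a near-surface extinction proxy.
x = aod / blh

# Ordinary least-squares fit of PM10 = a + b * (AOD / BLH)
b, a = np.polyfit(x, pm10, 1)
r = np.corrcoef(x, pm10)[0, 1]
print(f"slope={b:.1f}, intercept={a:.1f}, r={r:.2f}")
```

With real data, the correlation r of this fit is the quantity the study compares between SEVIRI and MODIS at daily and hourly time scales.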

Abstract:

Ocean acidification from the uptake of anthropogenic carbon is simulated for the industrial period and IPCC SRES emission scenarios A2 and B1 with a global coupled carbon cycle-climate model. Earlier studies identified seawater saturation state with respect to aragonite, a mineral phase of calcium carbonate, as a key variable governing impacts on corals and other shell-forming organisms. Globally in the A2 scenario, water saturated by more than 300%, considered suitable for coral growth, vanishes by 2070 AD (CO2≈630 ppm), and the ocean volume fraction occupied by saturated water decreases from 42% to 25% over this century. The largest simulated pH changes worldwide occur in Arctic surface waters, where hydrogen ion concentration increases by up to 185% (ΔpH=−0.45). Projected climate change amplifies the decrease in Arctic surface mean saturation and pH by more than 20%, mainly due to freshening and increased carbon uptake in response to sea ice retreat. Modeled saturation compares well with observation-based estimates along an Arctic transect and simulated changes have been corrected for remaining model-data differences in this region. Aragonite undersaturation in Arctic surface waters is projected to occur locally within a decade and to become more widespread as atmospheric CO2 continues to grow. The results imply that surface waters in the Arctic Ocean will become corrosive to aragonite, with potentially large implications for the marine ecosystem, if anthropogenic carbon emissions are not reduced and atmospheric CO2 not kept below 450 ppm.
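The reported Arctic figures are internally consistent: since pH is the negative base-10 logarithm of the hydrogen ion concentration, a 185% increase in [H+] corresponds directly to the stated ΔpH of −0.45. A quick check:

```python
import math

# pH = -log10([H+]); a 185% increase multiplies [H+] by 1 + 1.85 = 2.85.
delta_ph = -math.log10(1 + 1.85)
print(round(delta_ph, 2))  # → -0.45
```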

Abstract:

Widespread central hypersensitivity is present in chronic pain and contributes to pain and disability. According to animal studies, expansion of receptive fields of spinal cord neurons is involved in central hypersensitivity. We recently developed a method to quantify nociceptive receptive fields in humans using spinal withdrawal reflexes. Here we hypothesized that patients with chronic pelvic pain display enlarged reflex receptive fields. Secondary endpoints were subjective pain thresholds and nociceptive withdrawal reflex thresholds after single and repeated (temporal summation) electrical stimulation. Twenty patients and 25 pain-free subjects were tested. Electrical stimuli were applied to 10 sites on the foot sole to evoke reflexes in the tibialis anterior muscle. The reflex receptive field was defined as the area of the foot (fraction of the foot sole) from which a muscle contraction was evoked. For the secondary endpoints, the stimuli were applied to the cutaneous innervation area of the sural nerve. Medians (25-75 percentiles) of the fraction of the foot sole in patients and controls were 0.48 (0.38-0.54) and 0.33 (0.27-0.39), respectively (P=0.008). Pain and reflex thresholds after sural nerve stimulation were significantly lower in patients than in controls (P<0.001 for all measurements). This study provides the first evidence of widespread expansion of reflex receptive fields in chronic pain patients, thereby identifying a mechanism involved in central hypersensitivity in human chronic pain. Reverting the expansion of nociceptive receptive fields and exploring the prognostic meaning of this phenomenon may become future targets of clinical research.

Abstract:

Context Long-term antiretroviral therapy (ART) use in resource-limited countries leads to increasing numbers of patients with HIV taking second-line therapy. Limited access to further therapeutic options makes the evaluation of second-line regimen efficacy in these settings essential. Objectives To investigate failure rates in patients receiving second-line therapy and factors associated with failure and death. Design, Setting, and Participants Multicohort study of 632 patients >14 years old receiving second-line therapy for more than 6 months in 27 ART programs in Africa and Asia between January 2001 and October 2008. Main Outcome Measures Clinical, immunological, virological, and immunovirological failure (first diagnosed episode of immunological or virological failure) rates, and mortality after 6 months of second-line therapy use. Sensitivity analyses were performed using alternative CD4 cell count thresholds for immunological and immunovirological definitions of failure and for cohort attrition instead of death. Results The 632 patients provided 740.7 person-years of follow-up; 119 (18.8%) met World Health Organization failure criteria after a median 11.9 months following the start of second-line therapy (interquartile range [IQR], 8.7-17.0 months), and 34 (5.4%) died after a median 15.1 months (IQR, 11.9-25.7 months). Failure rates were lower in those who changed 2 nucleoside reverse transcriptase inhibitors (NRTIs) instead of 1 (179.2 vs 251.6 per 1000 person-years; incidence rate ratio [IRR], 0.64; 95% confidence interval [CI], 0.42-0.96), and higher in those with the lowest adherence index (383.5 vs 176.0 per 1000 person-years; IRR, 3.14; 95% CI, 1.67-5.90 for <80% vs ≥95% [percentage adherent, as represented by percentage of appointments attended with no delay]). 
Failure rates increased with lower CD4 cell counts when second-line therapy was started, from 156.3 vs 96.2 per 1000 person-years; IRR, 1.59 (95% CI, 0.78-3.25) for 100 to 199/μL to 336.8 per 1000 person-years; IRR, 3.32 (95% CI, 1.81-6.08) for less than 50/μL vs 200/μL or higher; and decreased with time using second-line therapy, from 250.0 vs 123.2 per 1000 person-years; IRR, 1.90 (95% CI, 1.19-3.02) for 6 to 11 months to 212.0 per 1000 person-years; 1.71 (95% CI, 1.01-2.88) for 12 to 17 months vs 18 or more months. Mortality for those taking second-line therapy was lower in women (32.4 vs 68.3 per 1000 person-years; hazard ratio [HR], 0.45; 95% CI, 0.23-0.91); and higher in patients with treatment failure of any type (91.9 vs 28.1 per 1000 person-years; HR, 2.83; 95% CI, 1.38-5.80). Sensitivity analyses showed similar results. Conclusions Among patients in Africa and Asia receiving second-line therapy for HIV, treatment failure was associated with low CD4 cell counts at second-line therapy start, use of suboptimal second-line regimens, and poor adherence. Mortality was associated with diagnosed treatment failure.

Abstract:

BACKGROUND: Exertional oscillatory ventilation (EOV) in heart failure may potentiate the negative effects of low cardiac output and high ventilation on exercise performance. We hypothesized that the presence of EOV might, per se, influence exercise capacity as evaluated by maximal cardiopulmonary exercise test. METHODS AND RESULTS: We identified 78 severe chronic heart failure patient pairs with and without EOV, matched for sex, age and peak oxygen consumption (VO2). Patients with EOV showed, for the same peak VO2, a lower workload (WL) at peak (ΔWatts=5.8±23.0, P=0.027), a less efficient ventilation (higher VE/VCO2 slope: 38.0±8.3 vs. 32.8±6.3, P<0.001), lower peak exercise tidal volume (1.49±0.36 L vs. 1.61±0.46 L, P=0.015) and higher peak respiratory rate (34±7/min vs. 31±6/min, P=0.002). In 33 patients, EOV disappeared during exercise, whereas in 45 patients EOV persisted. Fifty percent of the patients in whom EOV disappeared had an increase in the VO2/WL relationship after EOV regression, consistent with a more efficient oxygen delivery to the muscles. No cardiopulmonary exercise test parameter was associated with the different behaviour of VO2/WL. CONCLUSION: The presence of EOV negatively influences exercise performance of chronic heart failure patients, likely because of an increased cost of breathing. EOV disappearance during exercise is associated with more efficient oxygen delivery in several cases.

Abstract:

INTRODUCTION: Ultra-high-field whole-body systems (7.0 T) have a high potential for future human in vivo magnetic resonance imaging (MRI). In musculoskeletal MRI, biochemical imaging of articular cartilage may benefit in particular. Delayed gadolinium-enhanced MRI of cartilage (dGEMRIC) and T2 mapping have shown potential at 3.0 T. Whereas dGEMRIC allows the determination of the glycosaminoglycan content of articular cartilage, T2 mapping is a promising tool for the evaluation of water and collagen content. In addition, the evaluation of zonal variation, based on tissue anisotropy, provides an indicator of the nature of cartilage, i.e., hyaline or hyaline-like articular cartilage. Thus, the aim of our study was to show the feasibility of in vivo dGEMRIC, and T2 and T2* relaxation measurements, at 7.0 T MRI, and to evaluate the potential of T2 and T2* measurements in an initial patient study after matrix-associated autologous chondrocyte transplantation (MACT) in the knee. MATERIALS AND METHODS: MRI was performed on a whole-body 7.0 T MR scanner using a dedicated circular polarization knee coil. The protocol consisted of an inversion recovery sequence for dGEMRIC, a multiecho spin-echo sequence for standard T2 mapping, a gradient-echo sequence for T2* mapping and a morphologic PD SPACE sequence. Twelve healthy volunteers (mean age, 26.7 ± 3.4 years) and 4 patients (mean age, 38.0 ± 14.0 years) were enrolled 29.5 ± 15.1 months after MACT. For dGEMRIC, 5 healthy volunteers (mean age, 32.4 ± 11.2 years) were included. T1 maps were calculated using a nonlinear, 2-parameter, least squares fit analysis. Using a region-of-interest analysis, the mean cartilage relaxation rate was determined as T1 (0) for precontrast measurements and T1 (Gd) for postcontrast gadopentate dimeglumine [Gd-DTPA(2-)] measurements. 
T2 and T2* maps were obtained using a pixelwise, monoexponential, non-negative least squares fit analysis; region-of-interest analysis was carried out for deep and superficial cartilage aspects. Statistical evaluation was performed by analyses of variance. RESULTS: Mean T1 (dGEMRIC) values for healthy volunteers showed slightly different results for femoral [T1 (0): 1259 ± 277 ms; T1 (Gd): 683 ± 141 ms] compared with tibial cartilage [T1 (0): 1093 ± 281 ms; T1 (Gd): 769 ± 150 ms]. Global mean T2 relaxation for healthy volunteers showed comparable results for femoral (T2: 56.3 ± 15.2 ms; T2*: 19.7 ± 6.4 ms) and patellar (T2: 54.6 ± 13.0 ms; T2*: 19.6 ± 5.2 ms) cartilage, but lower values for tibial cartilage (T2: 43.6 ± 8.5 ms; T2*: 16.6 ± 5.6 ms). All healthy cartilage sites showed a significant increase from deep to superficial cartilage (P < 0.001). Within healthy cartilage sites in MACT patients, adequate values could be found for T2 (56.6 ± 13.2 ms) and T2* (18.6 ± 5.3 ms), which also showed a significant stratification. Within cartilage repair tissue, global mean values showed no difference, with 55.9 ± 4.9 ms for T2 and 16.2 ± 6.3 ms for T2*. However, zonal assessment showed only a slight and not significant increase from deep to superficial cartilage (T2: P = 0.174; T2*: P = 0.150). CONCLUSION: In vivo T1 dGEMRIC assessment in healthy cartilage, and T2 and T2* mapping in healthy and reparative articular cartilage, seem to be possible at 7.0 T MRI. For T2 and T2*, zonal variation of articular cartilage could also be evaluated at 7.0 T. This zonal assessment of deep and superficial cartilage aspects shows promising results for the differentiation of healthy and affected articular cartilage. In future studies, optimized protocol selection and sophisticated coil technology, together with increased signal at ultra-high-field MRI, may lead to advanced biochemical cartilage imaging.
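The pixelwise T2 mapping described above is a monoexponential decay fit, S(TE) = S0·exp(−TE/T2), applied per pixel to the multiecho spin-echo data. A minimal sketch using log-linearisation on noiseless synthetic data (the study itself used a non-negative least squares fit; the echo times and T2 value below are illustrative, with the T2 chosen on the order of the femoral cartilage values reported):

```python
import numpy as np

# Synthetic multi-echo signal with an assumed "true" T2 of 55 ms.
t2_true, s0 = 55.0, 1000.0
te = np.array([10.0, 20.0, 30.0, 40.0, 50.0, 60.0])  # echo times, ms
signal = s0 * np.exp(-te / t2_true)

# Monoexponential fit via log-linearisation:
# ln S(TE) = ln S0 - TE/T2, so a straight-line fit recovers 1/T2.
slope, intercept = np.polyfit(te, np.log(signal), 1)
t2_fit = -1.0 / slope
print(f"fitted T2 = {t2_fit:.1f} ms")
```

On real, noisy magnitude data the log-linear shortcut biases the estimate at low signal, which is why nonlinear least squares fits are preferred in practice.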

Abstract:

It is unknown whether transforming growth factor beta1 (TGF-beta1) signaling uniformly participates in fibrogenic chronic liver diseases, irrespective of the underlying origin, or if other cytokines such as interleukin (IL)-13 share in fibrogenesis (e.g., due to regulatory effects on type I pro-collagen expression). TGF-beta1 signaling events were scored in 396 liver tissue samples from patients with diverse chronic liver diseases, including hepatitis B virus (HBV), hepatitis C virus (HCV), Schistosoma japonicum infection, and steatosis/steatohepatitis. Phospho-Smad2 staining correlated significantly with fibrotic stage in patients with HBV infection (n = 112, P < 0.001) and steatosis/steatohepatitis (n = 120, P < 0.01), but not in patients with HCV infection (n = 77, P > 0.05). In tissue with HBx protein expression, phospho-Smad2 was detectable, suggesting a functional link between viral protein expression and TGF-beta1 signaling. For IL-13, immunostaining correlated with fibrotic stage in patients with HCV infection and steatosis/steatohepatitis. IL-13 protein was more abundant in liver tissue lysates from three HCV patients compared with controls, as were IL-13 serum levels in 68 patients with chronic HCV infection compared with 20 healthy volunteers (72.87 ± 26.38 versus 45.41 ± 3.73, P < 0.001). Immunohistochemistry results suggest that IL-13-mediated liver fibrogenesis may take place in the absence of phospho-signal transducer and activator of transcription protein 6 signaling. In a subgroup of patients with advanced liver fibrosis (stage ≥3), neither TGF-beta nor IL-13 signaling was detectable. Conclusion: Depending on the cause of liver damage, a predominance of TGF-beta or IL-13 signaling is found. TGF-beta1 predominance is detected in HBV-related liver fibrogenesis and IL-13 predominance in chronic HCV infection. In some instances, the underlying fibrogenic mediator remains enigmatic.

Abstract:

BACKGROUND: Osteoarthritis is the most common form of joint disease and the leading cause of pain and physical disability in the elderly. Opioids may be a viable treatment option if patients suffer from severe pain or if other analgesics are contraindicated. However, the evidence about their effectiveness and safety is contradictory. OBJECTIVES: To determine the effects on pain and function and the safety of oral or transdermal opioids as compared with placebo or no intervention in patients with osteoarthritis of the hip or knee. SEARCH STRATEGY: We searched CENTRAL, MEDLINE, EMBASE, and CINAHL (up to 28 July 2008), checked conference proceedings, reference lists, and contacted authors. SELECTION CRITERIA: Studies were included if they were randomised or quasi-randomised controlled trials that compared oral or transdermal opioids with placebo or no treatment in patients with osteoarthritis of the knee or hip. Studies of tramadol were excluded. No language restrictions were applied. DATA COLLECTION AND ANALYSIS: We extracted data in duplicate. Standardised mean differences (SMDs) and 95% confidence intervals (CI) were calculated for pain and function, and risk ratios for safety outcomes. Trials were combined using inverse-variance random-effects meta-analysis. MAIN RESULTS: Ten trials with 2268 participants were included. Oral codeine was studied in three trials, transdermal fentanyl and oral morphine in one trial each, oral oxycodone in four, and oral oxymorphone in two trials. Overall, opioids were more effective than control interventions in terms of pain relief (SMD -0.36, 95% CI -0.47 to -0.26) and improvement of function (SMD -0.33, 95% CI -0.45 to -0.21). We did not find substantial differences in effects according to type of opioid, analgesic potency (strong or weak), daily dose, duration of treatment or follow up, methodological quality of trials, and type of funding. Adverse events were more frequent in patients receiving opioids compared to control. 
The pooled risk ratio was 1.55 (95% CI 1.41 to 1.70) for any adverse event (4 trials), 4.05 (95% CI 3.06 to 5.38) for dropouts due to adverse events (10 trials), and 3.35 (95% CI 0.83 to 13.56) for serious adverse events (2 trials). Withdrawal symptoms were more severe after fentanyl treatment compared to placebo (SMD 0.60, 95% CI 0.42 to 0.79; 1 trial). AUTHORS' CONCLUSIONS: The small to moderate beneficial effects of non-tramadol opioids are outweighed by large increases in the risk of adverse events. Non-tramadol opioids should therefore not be routinely used, even if osteoarthritic pain is severe.
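The pooling described in the methods — inverse-variance random-effects meta-analysis of standardised mean differences — can be sketched with the DerSimonian-Laird estimator. The per-trial SMDs and standard errors below are invented for illustration and are not the review's data:

```python
import math

# Hypothetical per-trial standardised mean differences and standard errors;
# the review itself pooled ten trials.
smd = [-0.30, -0.45, -0.25, -0.40]
se = [0.10, 0.12, 0.09, 0.15]

# DerSimonian-Laird random-effects pooling.
w = [1 / s**2 for s in se]                        # fixed-effect (inverse-variance) weights
theta_fe = sum(wi * y for wi, y in zip(w, smd)) / sum(w)
q = sum(wi * (y - theta_fe)**2 for wi, y in zip(w, smd))  # heterogeneity statistic
df = len(smd) - 1
c = sum(w) - sum(wi**2 for wi in w) / sum(w)
tau2 = max(0.0, (q - df) / c)                     # between-trial variance estimate
w_re = [1 / (s**2 + tau2) for s in se]            # random-effects weights
theta = sum(wi * y for wi, y in zip(w_re, smd)) / sum(w_re)
se_pooled = math.sqrt(1 / sum(w_re))
lo, hi = theta - 1.96 * se_pooled, theta + 1.96 * se_pooled
print(f"pooled SMD {theta:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```

When the heterogeneity statistic q does not exceed its degrees of freedom, tau² is truncated to zero and the random-effects result coincides with the fixed-effect one.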

Abstract:

We compared atorvastatin with simvastatin-based therapies in a prospective observational study of 201 patients with severe hyperlipidaemia. Atorvastatin 10 mg therapy was substituted for simvastatin 20 mg, 20 mg for 40 mg, 40 mg for simvastatin 40 mg plus resin, and 80 mg for simvastatin-fibrate-resin therapy. Lipid and safety profiles were assessed. Atorvastatin reduced total cholesterol by 31 ± 11% to 40 ± 14% vs. 25 ± 12% to 31 ± 11%, LDL by 38 ± 16% to 45 ± 18% vs. 31 ± 18% to 39 ± 18%, and geometric mean triglycerides by 29.3-37.3% vs. 16.6-24.8%, but reduced HDL by 11 ± 47% at 80 mg compared with a 16 ± 34% increase with simvastatin-based therapy. The target LDL < 3.5 mmol/l was achieved more often with atorvastatin (63% vs. 50%; p < 0.001). Atorvastatin increased geometric mean fibrinogen by 12-20% vs. a 0-6% fall with simvastatin (p < 0.001). Side effects were noted in 10-36% of patients, including one case of rhabdomyolysis, and 36% discontinued therapy. These data suggest that atorvastatin is more effective than current simvastatin-based therapies in achieving treatment targets in patients with familial hypercholesterolaemia, but at the expense of a possible increase in side effects. This issue needs further study in randomized controlled trials.

Abstract:

In sport psychology, research about emotional contagion in sport teams has been scarce (Reicherts & Horn, 2008). Emotional contagion is a process leading to a specific emotional state in an individual, caused by the perception of another individual's emotional expression (Hatfield, Cacioppo & Rapson, 1994). Apitzsch (2009) described emotional contagion as one reason for collapsing sport teams. The present study examined the occurrence of emotional contagion in dyads during a basketball task and the impact of a socially induced emotional state on performance. An experiment with a between-subjects design was conducted. Participants (N=81, ♀=38, M=21.33 years, SD=1.45) were randomly assigned to one of two experimental conditions by joining a confederate to compose a same-gender, ad hoc team. The team was instructed to perform a basketball task as quickly as possible. The between-factor of the experimental design was the confederate's emotional expression (positive or negative valence). The within-factor was participants' emotional state, measured pre- and post-experimentally using the PANAS (Krohne, Egloff, Kohlmann & Tausch, 1996). The basketball task was video-taped, and the number of frames participants needed to complete the task was used to determine individual performance. The confederate's emotional expression was appraised in a significantly different manner across both experimental conditions by participants and video raters (MC). Mixed between-within subjects ANOVAs were conducted to examine the impact of the two conditions on participants' scores on the PANAS subscales across two time periods (pre- and post-experimental). No significant interaction effects but substantial main effects for time were found on both PANAS subscales. Both groups showed an increase in positive and a reduction in negative PANAS scores across these two time periods. 
Nevertheless, the video raters' assessment of the emotional states expressed by participants was significantly different between the positive (M=3.23, SD=0.45) and negative condition (M=2.39, SD=0.53; t=7.64, p<.001, eta squared=.43). An independent-samples t-test indicated no difference in performance between conditions. Furthermore, no significant correlation between the extent of positive or negative emotional contagion and the number of frames was observed. The basketball task led to an improvement of the emotional state of participants, independently of the condition. Even though participants' PANAS scores indicated a tendency towards emotional contagion, it was not statistically significant. This could be explained by the short task duration of approximately three minutes. Moreover, the performance of participants was unaffected by the experimental condition or the extent of positive or negative emotional contagion. Apitzsch, E. (2009). A case study of a collapsing handball team. In S. Jern & J. Näslund (Eds.), Dynamics within and outside the lab. Proceedings from The 6th Nordic Conference on Group and Social Psychology, May 2008, Lund, pp. 35-52. Hatfield, E., Cacioppo, J. T. & Rapson, R. L. (1994). Emotional contagion. Cambridge: Cambridge University Press. Krohne, H. W., Egloff, B., Kohlmann, C.-W. & Tausch, A. (1996). Untersuchungen mit einer deutschen Version der „Positive und Negative Affect Schedule" (PANAS). Diagnostica, 42 (2), 139-156. Reicherts, M. & Horn, A. B. (2008). Emotionen im Sport. In W. Schlicht & B. Strauss (Eds.), Enzyklopädie der Psychologie. Grundlagen der Sportpsychologie (Bd. 1) (pp. 563-633). Göttingen: Hogrefe.

Abstract:

BACKGROUND Intracoronary administration of autologous bone marrow-derived mononuclear cells (BM-MNC) may improve remodeling of the left ventricle (LV) after acute myocardial infarction. The optimal time point of administration of BM-MNC is still uncertain and has rarely been addressed prospectively in randomized clinical trials. METHODS AND RESULTS In a multicenter study, we randomized 200 patients with large, successfully reperfused ST-segment elevation myocardial infarction in a 1:1:1 pattern into an open-labeled control and 2 BM-MNC treatment groups. In the BM-MNC groups, cells were administered either early (i.e., 5 to 7 days) or late (i.e., 3 to 4 weeks) after acute myocardial infarction. Cardiac magnetic resonance imaging was performed at baseline and after 4 months. The primary end point was the change from baseline to 4 months in global LV ejection fraction between the 2 treatment groups and the control group. The absolute change in LV ejection fraction from baseline to 4 months was -0.4±8.8% (mean±SD; P=0.74 versus baseline) in the control group, 1.8±8.4% (P=0.12 versus baseline) in the early group, and 0.8±7.6% (P=0.45 versus baseline) in the late group. The treatment effect of BM-MNC as estimated by ANCOVA was 1.25 (95% confidence interval, -1.83 to 4.32; P=0.42) for the early therapy group and 0.55 (95% confidence interval, -2.61 to 3.71; P=0.73) for the late therapy group. CONCLUSIONS Among patients with ST-segment elevation myocardial infarction and LV dysfunction after successful reperfusion, intracoronary infusion of BM-MNC at either 5 to 7 days or 3 to 4 weeks after acute myocardial infarction did not improve LV function at 4-month follow-up.

Abstract:

BACKGROUND Open radical cystectomy (ORC) is associated with substantial blood loss and a high incidence of perioperative blood transfusions. Strategies to reduce blood loss and blood transfusion are warranted. OBJECTIVE To determine whether continuous norepinephrine administration combined with intraoperative restrictive hydration with Ringer's maleate solution can reduce blood loss and the need for blood transfusion. DESIGN, SETTING, AND PARTICIPANTS This was a double-blind, randomised, parallel-group, single-centre trial including 166 consecutive patients undergoing ORC with urinary diversion (UD). Exclusion criteria were severe hepatic or renal dysfunction, congestive heart failure, and contraindications to epidural analgesia. INTERVENTION Patients were randomly allocated to continuous norepinephrine administration starting with 2 μg/kg per hour combined with 1 ml/kg per hour until the bladder was removed, then to 3 ml/kg per hour of Ringer's maleate solution (norepinephrine/low-volume group) or 6 ml/kg per hour of Ringer's maleate solution throughout surgery (control group). OUTCOME MEASUREMENTS AND STATISTICAL ANALYSIS Intraoperative blood loss and the percentage of patients requiring blood transfusions perioperatively were assessed. Data were analysed using nonparametric statistical models. RESULTS AND LIMITATIONS Total median blood loss was 800 ml (range: 300-1700) in the norepinephrine/low-volume group versus 1200 ml (range: 400-2800) in the control group (p<0.0001). In the norepinephrine/low-volume group, 27 of 83 patients (33%) required an average of 1.8 U (±0.8) of packed red blood cells (PRBCs). In the control group, 50 of 83 patients (60%) required an average of 2.9 U (±2.1) of PRBCs during hospitalisation (relative risk: 0.54; 95% confidence interval [CI], 0.38-0.77; p=0.0006). The absolute reduction in transfusion rate throughout hospitalisation was 28% (95% CI, 12-45). 
In this study, surgery was performed by three high-volume surgeons using a standardised technique, so whether these significant results are reproducible in other centres needs to be shown. CONCLUSIONS Continuous norepinephrine administration combined with restrictive hydration significantly reduces intraoperative blood loss, the rate of blood transfusions, and the number of PRBC units required per patient undergoing ORC with UD.
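The headline effect sizes follow directly from the transfusion counts reported above:

```python
# Transfusion counts from the abstract: 27/83 patients in the
# norepinephrine/low-volume group vs 50/83 in the control group
# required PRBCs during hospitalisation.
treated_risk = 27 / 83
control_risk = 50 / 83

rr = treated_risk / control_risk    # relative risk
arr = control_risk - treated_risk   # absolute risk reduction
print(f"RR = {rr:.2f}")             # → RR = 0.54
print(f"ARR = {arr:.0%}")           # → ARR = 28%
```

Both values match the reported relative risk of 0.54 and the 28% absolute reduction in transfusion rate.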

Abstract:

Background Non-AIDS defining cancers (NADC) are an important cause of morbidity and mortality in HIV-positive individuals. Using data from a large international cohort of HIV-positive individuals, we described the incidence of NADC from 2004–2010, together with subsequent mortality and its predictors. Methods Individuals were followed from 1st January 2004/enrolment in the study until the earliest of a new NADC, 1st February 2010, death, or six months after the patient's last visit. Incidence rates were estimated for each year of follow-up, overall and stratified by gender, age and mode of HIV acquisition. Cumulative risk of mortality following NADC diagnosis was summarised using Kaplan-Meier methods, with follow-up for these analyses from the date of NADC diagnosis until the patient's death, 1st February 2010 or 6 months after the patient's last visit. Factors associated with mortality following NADC diagnosis were identified using multivariable Cox proportional hazards regression. Results Over 176,775 person-years (PY), 880 (2.1%) patients developed a new NADC (incidence: 4.98/1000 PY [95% confidence interval 4.65, 5.31]). Over a third of these patients (327, 37.2%) had died by 1st February 2010. Time trends for lung cancer, anal cancer and Hodgkin's lymphoma were broadly consistent. Kaplan-Meier cumulative mortality estimates at 1, 3 and 5 years after NADC diagnosis were 28.2% [95% CI 25.1-31.2], 42.0% [38.2-45.8] and 47.3% [42.4-52.2], respectively. Significant predictors of poorer survival after diagnosis of NADC were lung cancer (compared with other cancer types), male gender, non-white ethnicity, and smoking status. A later year of diagnosis and a higher CD4 count at NADC diagnosis were associated with improved survival. The incidence of NADC remained stable over the period 2004–2010 in this large observational cohort. Conclusions The prognosis after diagnosis of NADC, in particular lung cancer and disseminated cancer, is poor but has improved somewhat over time. 
Modifiable risk factors, such as smoking and low CD4 counts, were associated with mortality following a diagnosis of NADC.
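The reported incidence rate is reproducible from the counts given above:

```python
# 880 new NADC diagnoses observed over 176,775 person-years of follow-up.
events = 880
person_years = 176_775

rate_per_1000py = events / person_years * 1000
print(f"{rate_per_1000py:.2f} per 1000 person-years")  # → 4.98 per 1000 person-years
```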

Abstract:

This study reports the chemical composition of particles present along the North Greenland Eemian Ice Drilling (NEEM) ice core from Greenland, back to 110,000 years before present. Insoluble and soluble particles larger than 0.45 μm were extracted from the ice core by ice sublimation, and their chemical composition was analyzed using scanning electron microscopy with energy dispersive X-ray spectroscopy and micro-Raman spectroscopy. We show that the dominant insoluble components are silicates, whereas NaCl, Na₂SO₄, CaSO₄, and CaCO₃ represent the major soluble salts. For the first time, particles of CaMg(CO₃)₂ and Ca(NO₃)₂·4H₂O are identified in a Greenland ice core. The chemical speciation of the salts varies with past climatic conditions. Whereas the fraction of Na salts (NaCl + Na₂SO₄) exceeds that of Ca salts (CaSO₄ + CaCO₃) during the Holocene (0.6–11.7 kyr B.P.), the two fractions are similar during the Bølling-Allerød period (12.9–14.6 kyr B.P.). During cold climates such as the Younger Dryas (12.0–12.6 kyr B.P.) and the Last Glacial Maximum (LGM, 15.0–26.9 kyr B.P.), the fraction of Ca salts exceeds that of Na salts, showing that the most abundant ion generally controls the salt budget in each period. High-resolution analyses reveal changing particle compositions: those in Holocene ice show seasonal changes, and those in LGM ice show a difference between cloudy bands and clear layers, which again can be largely explained by the availability of ionic components in the aerosol of air masses reaching Greenland.

Abstract:

OBJECTIVES This study compared clinical outcomes and revascularization strategies among patients presenting with low ejection fraction, low-gradient (LEF-LG) severe aortic stenosis (AS) according to the assigned treatment modality. BACKGROUND The optimal treatment modality for patients with LEF-LG severe AS and concomitant coronary artery disease (CAD) requiring revascularization is unknown. METHODS Of 1,551 patients, 204 with LEF-LG severe AS (aortic valve area <1.0 cm(2), ejection fraction <50%, and mean gradient <40 mm Hg) were allocated to medical therapy (MT) (n = 44), surgical aortic valve replacement (SAVR) (n = 52), or transcatheter aortic valve replacement (TAVR) (n = 108). CAD complexity was assessed using the SYNTAX score (SS) in 187 of 204 patients (92%). The primary endpoint was mortality at 1 year. RESULTS LEF-LG severe AS patients undergoing SAVR were more likely to undergo complete revascularization (17 of 52, 35%) compared with TAVR (8 of 108, 8%) and MT (0 of 44, 0%) patients (p < 0.001). Compared with MT, both SAVR (adjusted hazard ratio [adj HR]: 0.16; 95% confidence interval [CI]: 0.07 to 0.38; p < 0.001) and TAVR (adj HR: 0.30; 95% CI: 0.18 to 0.52; p < 0.001) improved survival at 1 year. In TAVR and SAVR patients, CAD severity was associated with higher rates of cardiovascular death (no CAD: 12.2% vs. low SS [0 to 22], 15.3% vs. high SS [>22], 31.5%; p = 0.037) at 1 year. Compared with no CAD/complete revascularization, TAVR and SAVR patients undergoing incomplete revascularization had significantly higher 1-year cardiovascular death rates (adj HR: 2.80; 95% CI: 1.07 to 7.36; p = 0.037). CONCLUSIONS Among LEF-LG severe AS patients, SAVR and TAVR improved survival compared with MT. CAD severity was associated with worse outcomes and incomplete revascularization predicted 1-year cardiovascular mortality among TAVR and SAVR patients.