952 results for log-ratio analysis


Relevance: 30.00%

Publisher:

Abstract:

PURPOSE: To compare clinical outcomes of endovascular and open aortic repair of abdominal aortic aneurysms (AAAs) in young patients at low risk. It was hypothesized that endovascular aneurysm repair (EVAR) compares favorably with open aneurysm repair (OAR) in these patients. MATERIALS AND METHODS: Twenty-five patients aged 65 years or younger with a low perioperative surgical risk profile underwent EVAR at a single institution between April 1994 and May 2007 (23 men; mean age, 62 years +/- 2.8). A sex- and risk-matched group of 25 consecutive patients aged 65 years or younger who underwent OAR served as controls (23 men; mean age, 59 years +/- 3.9). Patient outcomes and complications were classified according to Society of Vascular Surgery/International Society for Cardiovascular Surgery reporting standards. RESULTS: Mean follow-up times were 7.1 years +/- 3.2 after EVAR and 5.9 years +/- 1.8 after OAR (P=.1020). Total complication rates were 20% after EVAR and 52% after OAR (P=.0378), and all complications were mild or moderate. Mean intensive care unit stays were 0.2 days +/- 0.4 after EVAR and 1.1 days +/- 0.4 after OAR (P<.0001), and mean lengths of hospital stay were 2.3 days +/- 1.0 after EVAR and 5.0 days +/- 2.1 after OAR (P<.0001). Cumulative rates of long-term patient survival did not differ between EVAR and OAR (P=.144). No AAA-related deaths or aortoiliac ruptures occurred during follow-up for EVAR and OAR. In addition, no surgical conversions were necessary in EVAR recipients. Cumulative rates of freedom from secondary procedures were not significantly different between the EVAR and OAR groups (P=.418). In a multivariable Cox proportional-hazards analysis adjusted for patient age, maximum AAA diameter, and cardiac risk score, all-cause mortality (odds ratio [OR], 0.125; 95% CI, 0.010-1.493; P=.100) and need for secondary procedures (OR, 5.014; 95% CI, 0.325-77.410; P=.537) did not differ between EVAR and OAR. 
CONCLUSIONS: Results from this observational study indicate that EVAR offers a favorable alternative to OAR in young patients at low risk.

Relevance: 30.00%

Publisher:

Abstract:

This study focuses on a specific engine, i.e., a dual-spool, separate-flow turbofan engine with an Interstage Turbine Burner (ITB). This conventional turbofan engine has been modified to include a secondary isobaric burner, i.e., the ITB, in a transition duct between the high-pressure turbine and the low-pressure turbine. The preliminary design phase for this modified engine starts with an aerothermodynamic cycle analysis consisting of parametric (i.e., on-design) and performance (i.e., off-design) cycle analyses. In the parametric analysis, the modified engine's performance parameters are evaluated and compared with those of the baseline engine in terms of design limitations (maximum turbine inlet temperature), flight conditions (such as flight Mach number, ambient temperature, and pressure), and design choices (such as compressor pressure ratio, fan pressure ratio, fan bypass ratio, etc.). A turbine cooling model is also included to account for the effect of cooling air on engine performance. The results from the on-design analysis confirmed the advantage of using an ITB, i.e., higher specific thrust with small increases in thrust-specific fuel consumption, less cooling air, and less NOx production, provided that the main burner exit temperature and ITB exit temperature are properly specified. It is also important to identify the critical ITB temperature, beyond which the ITB is turned off and offers no advantage at all. With the encouraging results from the parametric cycle analysis, a detailed performance cycle analysis of the same engine was also conducted for steady-state engine performance prediction. The results from the off-design cycle analysis show that the ITB engine at full throttle setting has enhanced performance over the baseline engine. Furthermore, the ITB engine operating at partial throttle settings exhibits higher thrust at lower specific fuel consumption and improved thermal efficiency over the baseline engine. 
A mission analysis is also presented to predict fuel consumption in certain mission phases. Excel macro code, written in Visual Basic for Applications, and Excel worksheet cells are combined so that Excel itself performs these cycle analyses. These user-friendly programs compute and plot the data sequentially without forcing users to open other post-processing programs.

Relevance: 30.00%

Publisher:

Abstract:

Demand for bio-fuels is expected to increase, due to rising prices of fossil fuels and concerns over greenhouse gas emissions and energy security. The overall cost of biomass energy generation is primarily related to biomass harvesting activity, transportation, and storage. With a commercial-scale cellulosic ethanol processing facility in Kinross Township of Chippewa County, Michigan about to be built, models including a simulation model and an optimization model have been developed to provide decision support for the facility. Both models track cost, emissions and energy consumption. While the optimization model provides guidance for a long-term strategic plan, the simulation model aims to present detailed output for specified operational scenarios over an annual period. Most importantly, the simulation model considers the uncertainty of spring break-up timing, i.e., seasonal road restrictions. Spring break-up timing is important because it will impact the feasibility of harvesting activity and the time duration of transportation restrictions, which significantly changes the availability of feedstock for the processing facility. This thesis focuses on the statistical model of spring break-up used in the simulation model. Spring break-up timing depends on various factors, including temperature, road conditions and soil type, as well as individual decision making processes at the county level. The spring break-up model, based on the historical spring break-up data from 27 counties over the period of 2002-2010, starts by specifying the probability distribution of a particular county’s spring break-up start day and end day, and then relates the spring break-up timing of the other counties in the harvesting zone to the first county. In order to estimate the dependence relationship between counties, regression analyses, including standard linear regression and reduced major axis regression, are conducted. 
Using realizations (scenarios) of spring break-up generated by the statistical spring break-up model, the simulation model is able to probabilistically evaluate different harvesting and transportation plans to help the bio-fuel facility select the most effective strategy. For early spring break-up, which usually indicates a longer-than-average break-up period, more log storage is required, total cost increases, and the probability of plant closure increases. The risk of plant closure may be partially offset through increased use of rail transportation, which is not subject to spring break-up restrictions. However, rail availability and rail yard storage may then become limiting factors in the supply chain. Rail use will impact total cost, energy consumption, system-wide CO2 emissions, and the reliability of providing feedstock to the bio-fuel processing facility.
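The two regression approaches named above differ in a simple way: ordinary least squares treats one county's break-up date as error-free, whereas reduced major axis (RMA) regression treats both variables as measured with error, so its slope is the ratio of standard deviations, signed by the correlation. A minimal sketch with hypothetical break-up start days (day of year), not the thesis data:

```python
from statistics import mean, stdev

def rma_fit(x, y):
    """Reduced major axis (geometric mean) regression.

    Unlike ordinary least squares, RMA assumes both variables carry
    error: slope = sign(r) * sd(y) / sd(x).
    """
    mx, my = mean(x), mean(y)
    sx, sy = stdev(x), stdev(y)
    n = len(x)
    # Pearson correlation, needed only for the sign of the slope
    r = sum((a - mx) * (b - my) for a, b in zip(x, y)) / ((n - 1) * sx * sy)
    slope = (1 if r >= 0 else -1) * sy / sx
    intercept = my - slope * mx
    return slope, intercept

# Hypothetical break-up start days for two neighboring counties
county_a = [68, 72, 75, 70, 80, 77, 74, 71, 69]
county_b = [70, 75, 79, 72, 85, 80, 77, 74, 71]
slope, intercept = rma_fit(county_a, county_b)
```

Because RMA is symmetric in x and y, fitting county_b on county_a and inverting gives the same line, which is why it is often preferred when neither county's timing is the "true" predictor.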

Relevance: 30.00%

Publisher:

Abstract:

BACKGROUND: In HIV type-1-infected patients starting highly active antiretroviral therapy (HAART), the prognostic value of haemoglobin at the start of HAART, and of changes in haemoglobin levels, is not well defined. METHODS: We combined data from 10 prospective studies of 12,100 previously untreated individuals (25% women). A total of 4,222 patients (35%) were anaemic: 131 patients (1.1%) had severe (<8.0 g/dl), 1,120 (9%) moderate (male 8.0 to <11.0 g/dl; female 8.0 to <10.0 g/dl) and 2,971 (25%) mild (male 11.0 to <13.0 g/dl; female 10.0 to <12.0 g/dl) anaemia. We separately analysed progression to AIDS or death from baseline and from 6 months using Weibull models, adjusting for CD4+ T-cell count, age, sex and other variables. RESULTS: During 48,420 person-years of follow-up, 1,448 patients developed at least one AIDS event and 857 patients died. Anaemia at baseline was independently associated with higher mortality: the adjusted hazard ratio (95% confidence interval) was 1.42 (1.17-1.73) for mild, 2.56 (2.07-3.18) for moderate and 5.26 (3.55-7.81) for severe anaemia. Corresponding figures for progression to AIDS were 1.60 (1.37-1.86), 2.00 (1.66-2.40) and 2.24 (1.46-3.42). At 6 months the prevalence of anaemia had declined to 26%. Baseline anaemia continued to predict mortality (and, to a lesser extent, progression to AIDS) in patients with normal haemoglobin or mild anaemia at 6 months. CONCLUSIONS: Anaemia at the start of HAART is an important factor for short- and long-term prognosis, including in patients whose haemoglobin levels improve or normalize during the first 6 months of HAART.

Relevance: 30.00%

Publisher:

Abstract:

BACKGROUND: Multidimensional preventive home visit programs aim at maintaining the health and autonomy of older adults and preventing disability and subsequent nursing home admission, but results of randomized controlled trials (RCTs) have been inconsistent. Our objective was to systematically review RCTs examining the effect of home visit programs on mortality, nursing home admissions, and functional status decline. METHODS: Data sources were MEDLINE, EMBASE, the Cochrane CENTRAL database, and references. Studies were reviewed to identify RCTs that compared outcome data of older participants in preventive home visit programs with control group outcome data. Publications reporting 21 trials were included. Data on study population, intervention characteristics, outcomes, and trial quality were double-extracted. We conducted random-effects meta-analyses. RESULTS: Pooled effect estimates revealed statistically nonsignificant, favorable, and heterogeneous effects on mortality (odds ratio [OR] 0.92, 95% confidence interval [CI], 0.80-1.05), functional status decline (OR 0.89, 95% CI, 0.77-1.03), and nursing home admission (OR 0.86, 95% CI, 0.68-1.10). A beneficial effect on mortality was seen in younger study populations (OR 0.74, 95% CI, 0.58-0.94) but not in older populations (OR 1.14, 95% CI, 0.90-1.43). Functional decline was reduced in programs including a clinical examination in the initial assessment (OR 0.64, 95% CI, 0.48-0.87) but not in other trials (OR 1.00, 95% CI, 0.88-1.14). No single factor explained the heterogeneous effects of trials on nursing home admissions. CONCLUSION: Multidimensional preventive home visits have the potential to reduce disability burden among older adults when based on multidimensional assessment with clinical examination. Effects on nursing home admissions are heterogeneous and likely depend on multiple factors, including population factors, program characteristics, and health care setting.
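Random-effects pooling of this kind is commonly done with the DerSimonian-Laird method: inverse-variance weighting, with a between-study variance tau^2 estimated from Cochran's Q and added to each study's variance. A minimal sketch on made-up log odds ratios (illustrative only, not the trial data reviewed above):

```python
import math

def dersimonian_laird(effects, variances):
    """Random-effects pooling of study effects (e.g. log odds ratios)
    by the DerSimonian-Laird method."""
    w = [1.0 / v for v in variances]                       # fixed-effect weights
    fixed = sum(wi * yi for wi, yi in zip(w, effects)) / sum(w)
    # Cochran's Q and the moment estimate of between-study variance tau^2
    q = sum(wi * (yi - fixed) ** 2 for wi, yi in zip(w, effects))
    df = len(effects) - 1
    c = sum(w) - sum(wi ** 2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - df) / c)
    # random-effects weights add tau^2 to each within-study variance
    w_re = [1.0 / (v + tau2) for v in variances]
    pooled = sum(wi * yi for wi, yi in zip(w_re, effects)) / sum(w_re)
    se = math.sqrt(1.0 / sum(w_re))
    return pooled, se, tau2

# Hypothetical log odds ratios and variances from five trials
log_ors = [-0.12, 0.05, -0.30, -0.08, 0.10]
variances = [0.04, 0.06, 0.09, 0.05, 0.07]
pooled, se, tau2 = dersimonian_laird(log_ors, variances)
or_pooled = math.exp(pooled)
ci = (math.exp(pooled - 1.96 * se), math.exp(pooled + 1.96 * se))
```

When tau^2 is estimated as zero the random-effects result collapses to the fixed-effect one, which is why homogeneous trial sets give identical answers under both models.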

Relevance: 30.00%

Publisher:

Abstract:

The number of record-breaking events expected to occur in a strictly stationary time-series depends only on the number of values in the time-series, regardless of distribution. This holds whether the events are record-breaking highs or lows and whether we count from past to present or present to past. However, these symmetries are broken in distinct ways by trends in the mean and variance. We define indices that capture this information and use them to detect weak trends from multiple time-series. Here, we use these methods to answer the following questions: (1) Is there a variability trend among globally distributed surface temperature time-series? We find a significant decreasing variability over the past century for the Global Historical Climatology Network (GHCN). This corresponds to about a 10% change in the standard deviation of inter-annual monthly mean temperature distributions. (2) How are record-breaking high and low surface temperatures in the United States affected by time period? We investigate the United States Historical Climatology Network (USHCN) and find that the ratio of record-breaking highs to lows in 2006 increases as the time-series extend further into the past. When we consider the ratio as it evolves with respect to a fixed start year, we find it is strongly correlated with the ensemble mean. We also compare the ratios for USHCN and GHCN (minus USHCN stations). We find the ratios grow monotonically in the GHCN data set, but not in the USHCN data set. (3) Do we detect either mean or variance trends in annual precipitation within the United States? We find that the total annual and monthly precipitation in the United States (USHCN) has increased over the past century. Evidence for a trend in variance is inconclusive.
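The distribution-free property underlying these record indices is that, for an i.i.d. series of length n, the expected number of record highs equals the harmonic number H_n = 1 + 1/2 + ... + 1/n, whatever the underlying distribution. A quick Monte Carlo check of this symmetry (illustrative only):

```python
import random

def count_records(series):
    """Number of record-breaking highs: values exceeding every prior value."""
    best = float("-inf")
    records = 0
    for x in series:
        if x > best:
            records += 1
            best = x
    return records

def harmonic(n):
    """Expected record count H_n for an i.i.d. series of length n."""
    return sum(1.0 / k for k in range(1, n + 1))

random.seed(42)
n, trials = 50, 10000
# Two very different distributions should give the same mean record count
mean_uniform = sum(count_records([random.random() for _ in range(n)])
                   for _ in range(trials)) / trials
mean_gauss = sum(count_records([random.gauss(0, 5) for _ in range(n)])
                 for _ in range(trials)) / trials
```

A trend in the mean breaks this symmetry (an upward trend inflates forward-in-time record highs), which is what makes the counts usable as trend indices.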

Relevance: 30.00%

Publisher:

Abstract:

BACKGROUND: Reports on the effects of focal hemispheric damage on sleep EEG are rare and contradictory. PATIENTS AND METHODS: Twenty patients (mean age +/- SD 53 +/- 14 years) with a first acute hemispheric stroke and no sleep apnea were studied. Stroke severity [National Institute of Health Stroke Scale (NIHSS)], volume (diffusion-weighted brain MRI), and short-term outcome (Rankin score) were assessed. Within the first 8 days after stroke onset, 1-3 sleep EEG recordings per patient were performed. Sleep scoring and spectral analysis were based on the central derivation of the healthy hemisphere. Data were compared with those of 10 age-matched and gender-matched hospitalized controls with no brain damage and no sleep apnea. RESULTS: Stroke patients had higher amounts of wakefulness after sleep onset (112 +/- 53 min vs. 60 +/- 38 min, p < 0.05) and a lower sleep efficiency (76 +/- 10% vs. 86 +/- 8%, p < 0.05) than controls. Time spent in slow-wave sleep (SWS) and rapid eye movement (REM) sleep and total sleep time were lower in stroke patients, but differences were not significant. A positive correlation was found between the amount of SWS and stroke volume (r = 0.79). The slow-wave activity (SWA) ratio NREM sleep/wakefulness was lower in patients than in controls (p < 0.05), and correlated with NIHSS (r = -0.47). CONCLUSION: Acute hemispheric stroke is accompanied by alterations of sleep EEG over the healthy hemisphere that correlate with stroke volume and outcome. The increased SWA during wakefulness and SWS over the healthy hemisphere contralaterally to large strokes may reflect neuronal hypometabolism induced transhemispherically (diaschisis).

Relevance: 30.00%

Publisher:

Abstract:

BACKGROUND: Erythropoiesis-stimulating agents (ESAs) reduce anemia in cancer patients and may improve quality of life, but there are concerns that ESAs might increase mortality. OBJECTIVES: Our objectives were to examine the effect of ESAs and identify factors that modify the effects of ESAs on overall survival, progression-free survival, thromboembolic and cardiovascular events, as well as need for transfusions and other important safety and efficacy outcomes in cancer patients. SEARCH STRATEGY: We searched the Cochrane Library, Medline, Embase, and conference proceedings for eligible trials. Manufacturers of ESAs were contacted to identify additional trials. SELECTION CRITERIA: We included randomized controlled trials comparing epoetin or darbepoetin plus red blood cell transfusions (as necessary) versus red blood cell transfusions (as necessary) alone, to prevent or treat anemia in adult or pediatric cancer patients with or without concurrent antineoplastic therapy. DATA COLLECTION AND ANALYSIS: We performed a meta-analysis of randomized controlled trials comparing epoetin alpha, epoetin beta or darbepoetin alpha plus red blood cell transfusions versus transfusion alone, for prophylaxis or therapy of anemia while or after receiving anti-cancer treatment. Patient-level data were obtained and analyzed by independent statisticians at two academic departments, using fixed-effects and random-effects meta-analysis. Analyses were according to the intention-to-treat principle. Primary endpoints were on-study mortality and overall survival during the longest available follow-up, regardless of anticancer treatment, and in patients receiving chemotherapy. Tests for interactions were used to identify differences in effects of ESAs on mortality across pre-specified subgroups. The present review reports only the results for the primary endpoint. MAIN RESULTS: A total of 13,933 cancer patients from 53 trials were analyzed; 1,530 patients died on-study and 4,993 overall. 
ESAs increased on-study mortality (combined hazard ratio [cHR] 1.17; 95% CI 1.06-1.30) and worsened overall survival (cHR 1.06; 95% CI 1.00-1.12), with little heterogeneity between trials (I(2) 0%, p=0.87 and I(2) 7.1%, p=0.33, respectively). Thirty-eight trials enrolled 10,441 patients receiving chemotherapy. The cHR was 1.10 (95% CI 0.98-1.24) for on-study mortality and 1.04 (95% CI 0.97-1.11) for overall survival. There was little evidence for a difference between trials of patients receiving different cancer treatments (P for interaction=0.42). AUTHORS' CONCLUSIONS: ESA treatment in cancer patients increased on-study mortality and worsened overall survival. For patients undergoing chemotherapy the increase was less pronounced, but an adverse effect could not be excluded.

Relevance: 30.00%

Publisher:

Abstract:

BACKGROUND: Erythropoiesis-stimulating agents reduce anaemia in patients with cancer and could improve their quality of life, but these drugs might increase mortality. We therefore did a meta-analysis of randomised controlled trials in which these drugs plus red blood cell transfusions were compared with transfusion alone for prophylaxis or treatment of anaemia in patients with cancer. METHODS: Data for patients treated with epoetin alfa, epoetin beta, or darbepoetin alfa were obtained and analysed by independent statisticians using fixed-effects and random-effects meta-analysis. Analyses were by intention to treat. Primary endpoints were mortality during the active study period and overall survival during the longest available follow-up, irrespective of anticancer treatment, and in patients given chemotherapy. Tests for interactions were used to identify differences in effects of erythropoiesis-stimulating agents on mortality across prespecified subgroups. FINDINGS: Data from a total of 13 933 patients with cancer in 53 trials were analysed. 1530 patients died during the active study period and 4993 overall. Erythropoiesis-stimulating agents increased mortality during the active study period (combined hazard ratio [cHR] 1.17, 95% CI 1.06-1.30) and worsened overall survival (1.06, 1.00-1.12), with little heterogeneity between trials (I(2) 0%, p=0.87 for mortality during the active study period, and I(2) 7.1%, p=0.33 for overall survival). 10 441 patients on chemotherapy were enrolled in 38 trials. The cHR for mortality during the active study period was 1.10 (0.98-1.24), and 1.04 (0.97-1.11) for overall survival. There was little evidence for a difference between trials of patients given different anticancer treatments (p for interaction=0.42). INTERPRETATION: Treatment with erythropoiesis-stimulating agents in patients with cancer increased mortality during active study periods and worsened overall survival. 
The increased risk of death associated with treatment with these drugs should be balanced against their benefits. FUNDING: German Federal Ministry of Education and Research, Medical Faculty of University of Cologne, and Oncosuisse (Switzerland).

Relevance: 30.00%

Publisher:

Abstract:

BACKGROUND: The CD4 cell count at which combination antiretroviral therapy should be started is a central, unresolved issue in the care of HIV-1-infected patients. In the absence of randomised trials, we examined this question in prospective cohort studies. METHODS: We analysed data from 18 cohort studies of patients with HIV. Antiretroviral-naive patients from 15 of these studies were eligible for inclusion if they had started combination antiretroviral therapy (while AIDS-free, with a CD4 cell count less than 550 cells per microL, and with no history of injecting drug use) on or after Jan 1, 1998. We used data from patients followed up in seven of the cohorts in the era before the introduction of combination therapy (1989-95) to estimate distributions of lead times (from the first CD4 cell count measurement in an upper range to the upper threshold of a lower range) and unseen AIDS and death events (occurring before the upper threshold of a lower CD4 cell count range is reached) in the absence of treatment. These estimations were used to impute completed datasets in which lead times and unseen AIDS and death events were added to data for treated patients in deferred therapy groups. We compared the effect of deferred initiation of combination therapy with immediate initiation on rates of AIDS and death, and on death alone, in adjacent CD4 cell count ranges of width 100 cells per microL. FINDINGS: Data were obtained for 21 247 patients who were followed up during the era before the introduction of combination therapy and 24 444 patients who were followed up from the start of treatment. Deferring combination therapy until a CD4 cell count of 251-350 cells per microL was associated with higher rates of AIDS and death than starting therapy in the range 351-450 cells per microL (hazard ratio [HR] 1.28, 95% CI 1.04-1.57). The adverse effect of deferring treatment increased with decreasing CD4 cell count threshold. 
Deferred initiation of combination therapy was also associated with higher mortality rates, although effects on mortality were less marked than effects on AIDS and death (HR 1.13, 0.80-1.60, for deferred initiation of treatment at CD4 cell count 251-350 cells per microL compared with initiation at 351-450 cells per microL). INTERPRETATION: Our results suggest that 350 cells per microL should be the minimum threshold for initiation of antiretroviral therapy, and should help to guide physicians and patients in deciding when to start treatment.

Relevance: 30.00%

Publisher:

Abstract:

OBJECTIVE: Excess body weight, defined by body mass index (BMI), may increase the risk of colorectal cancer. As a prerequisite to the determination of lifestyle attributable risks, we undertook a systematic review and meta-analysis of prospective observational studies to quantify colorectal cancer risk associated with increased BMI and explore for differences by gender, sub-site and study characteristics. METHOD: We searched MEDLINE and EMBASE (to December 2007), and other sources, selecting reports based on strict inclusion criteria. Random-effects meta-analyses and meta-regressions of study-specific incremental estimates were performed to determine the risk ratio (RR) and 95% confidence intervals (CIs) associated with a 5 kg/m(2) increase in BMI. RESULTS: We analysed 29 datasets from 28 articles, including 67,361 incident cases. Higher BMI was associated with colon (RR 1.24, 95% CIs: 1.20-1.28) and rectal (1.09, 1.05-1.14) cancers in men, and with colon cancer (1.09, 1.04-1.12) in women. Associations were stronger in men than in women for colon (P < 0.001) and rectal (P = 0.005) cancers. Associations were generally consistent across geographic populations. Study characteristics and adjustments accounted for only moderate variations of associations. CONCLUSION: Increasing BMI is associated with a modest increased risk of developing colon and rectal cancers, but this modest risk may translate to large attributable proportions in high-prevalence obese populations. Inter-gender differences point to potentially important mechanistic differences, which merit further research.
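Because such meta-analyses assume a log-linear dose-response in BMI, a risk ratio reported per 5 kg/m(2) can be rescaled to any other increment via RR_delta = RR_5^(delta/5). A small sketch applying this standard rescaling to the men's colon cancer estimate quoted above (the rescaling itself is a modeling assumption, not a result of the review):

```python
import math

def rescale_rr(rr_per_5, delta_bmi):
    """Rescale a risk ratio reported per 5 kg/m^2 of BMI to an
    arbitrary increment, assuming a log-linear dose-response:
    RR_delta = RR_5 ** (delta / 5)."""
    return math.exp(math.log(rr_per_5) * delta_bmi / 5.0)

# Colon cancer in men: RR 1.24 per 5 kg/m^2 (from the abstract)
rr_10 = rescale_rr(1.24, 10.0)  # implied RR for a 10 kg/m^2 higher BMI
rr_1 = rescale_rr(1.24, 1.0)    # implied RR per single BMI unit
```

The same log-linearity is what lets study-specific estimates reported over different BMI increments be converted to a common per-5-unit scale before pooling.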

Relevance: 30.00%

Publisher:

Abstract:

PURPOSE To explore whether population-related pharmacogenomics contribute to differences in patient outcomes between clinical trials performed in Japan and the United States, given similar study designs, eligibility criteria, staging, and treatment regimens. METHODS We prospectively designed and conducted three phase III trials (Four-Arm Cooperative Study, LC00-03, and S0003) in advanced-stage, non-small-cell lung cancer, each with a common arm of paclitaxel plus carboplatin. Genomic DNA was collected from patients in LC00-03 and S0003 who received paclitaxel (225 mg/m(2)) and carboplatin (area under the concentration-time curve, 6). Genotypic variants of CYP3A4, CYP3A5, CYP2C8, NR1I2-206, ABCB1, ERCC1, and ERCC2 were analyzed by pyrosequencing or by PCR restriction fragment length polymorphism. Results were assessed by Cox model for survival and by logistic regression for response and toxicity. RESULTS Clinical results were similar in the two Japanese trials, and were significantly different from the US trial, for survival, neutropenia, febrile neutropenia, and anemia. There was a significant difference between Japanese and US patients in genotypic distribution for CYP3A4*1B (P = .01), CYP3A5*3C (P = .03), ERCC1 118 (P < .0001), ERCC2 K751Q (P < .001), and CYP2C8 R139K (P = .01). Genotypic associations were observed between CYP3A4*1B and progression-free survival (hazard ratio [HR], 0.36; 95% CI, 0.14 to 0.94; P = .04), and between ERCC2 K751Q and response (HR, 0.33; 95% CI, 0.13 to 0.83; P = .02). For grade 4 neutropenia, the HR for ABCB1 3435C>T was 1.84 (95% CI, 0.77 to 4.48; P = .19). CONCLUSION Differences in allelic distribution for genes involved in paclitaxel disposition or DNA repair were observed between Japanese and US patients. In an exploratory analysis, genotype-related associations with patient outcomes were observed for CYP3A4*1B and ERCC2 K751Q. 
This common-arm approach facilitates the prospective study of population-related pharmacogenomics in which ethnic differences in antineoplastic drug disposition are anticipated.

Relevance: 30.00%

Publisher:

Abstract:

BACKGROUND In many resource-limited settings monitoring of combination antiretroviral therapy (cART) is based on the current CD4 count, with limited access to HIV RNA tests or laboratory diagnostics. We examined whether the CD4 count slope over 6 months could provide additional prognostic information. METHODS We analyzed data from a large multicohort study in South Africa, where HIV RNA is routinely monitored. Adult HIV-positive patients initiating cART between 2003 and 2010 were included. Mortality was analyzed in Cox models; the CD4 count slope by HIV RNA level was assessed using linear mixed models. RESULTS A total of 44,829 patients (median age: 35 years, 58% female, median CD4 count at cART initiation: 116 cells/mm(3)) were followed up for a median of 1.9 years, with 3706 deaths. Mean CD4 count slopes per week ranged from 1.4 [95% confidence interval (CI): 1.2 to 1.6] cells per cubic millimeter when HIV RNA was <400 copies per milliliter to -0.32 (95% CI: -0.47 to -0.18) cells per cubic millimeter with >100,000 copies per milliliter. The association of CD4 slope with mortality depended on the current CD4 count: the adjusted hazard ratio (aHR) comparing a >25% increase over 6 months with a >25% decrease was 0.68 (95% CI: 0.58 to 0.79) at <100 cells per cubic millimeter but 1.11 (95% CI: 0.78 to 1.58) at 201-350 cells per cubic millimeter. In contrast, the aHR for current CD4 count, comparing >350 with <100 cells per cubic millimeter, was 0.10 (95% CI: 0.05 to 0.20). CONCLUSIONS Absolute CD4 count remains a strong risk factor for mortality, with a stable effect size over the first 4 years of cART. However, CD4 count slope and HIV RNA added independent prognostic information to the model.

Relevance: 30.00%

Publisher:

Abstract:

OBJECTIVES Zidovudine (ZDV) is recommended for first-line antiretroviral therapy (ART) in resource-limited settings. ZDV may, however, lead to anemia and impaired immunological response. We compared CD4+ cell counts over 5 years between patients starting ART with and without ZDV in southern Africa. DESIGN Cohort study. METHODS Patients aged at least 16 years who started first-line ART in South Africa, Botswana, Zambia, or Lesotho were included. We used linear mixed-effect models to compare CD4+ cell count trajectories between patients on ZDV-containing regimens and patients on other regimens, censoring follow-up at first treatment change. Impaired immunological recovery, defined as a CD4+ cell count below 100 cells/μl at 1 year, was assessed in logistic regression. Analyses were adjusted for baseline CD4+ cell count and hemoglobin level, age, sex, type of regimen, viral load monitoring, and calendar year. RESULTS A total of 72,597 patients starting ART, including 19,758 (27.2%) on ZDV, were analyzed. Patients on ZDV had higher CD4+ cell counts (150 vs. 128 cells/μl) and hemoglobin levels (12.0 vs. 11.0 g/dl) at baseline, and were less likely to be women than those on other regimens. Adjusted differences in CD4+ cell counts between regimens containing and not containing ZDV were -16 cells/μl [95% confidence interval (CI) -18 to -14] at 1 year and -56 cells/μl (95% CI -59 to -52) at 5 years. Impaired immunological recovery was more likely with ZDV than with other regimens (odds ratio 1.40, 95% CI 1.22-1.61). CONCLUSION In southern Africa, ZDV is associated with inferior immunological recovery compared with other backbones. Replacing ZDV with another nucleoside reverse transcriptase inhibitor could avoid unnecessary switches to second-line ART.

Relevance: 30.00%

Publisher:

Abstract:

BACKGROUND There is ongoing debate on the optimal drug-eluting stent (DES) in diabetic patients with coronary artery disease. Biodegradable polymer drug-eluting stents (BP-DES) may improve clinical outcomes in these high-risk patients. We sought to compare long-term outcomes in patients with diabetes treated with biodegradable polymer DES vs. durable polymer sirolimus-eluting stents (SES). METHODS We pooled individual patient-level data from 3 randomized clinical trials (ISAR-TEST 3, ISAR-TEST 4 and LEADERS) comparing biodegradable polymer DES with durable polymer SES. Clinical outcomes out to 4 years were assessed. The primary end point was the composite of cardiac death, myocardial infarction and target-lesion revascularization. Secondary end points were target-lesion revascularization and definite or probable stent thrombosis. RESULTS Of 1094 patients with diabetes included in the present analysis, 657 received biodegradable polymer DES and 437 durable polymer SES. At 4 years, the incidence of the primary end point was similar with BP-DES versus SES (hazard ratio=0.95, 95% CI=0.74-1.21, P=0.67). Target-lesion revascularization was also comparable between the groups (hazard ratio=0.89, 95% CI=0.65-1.22, P=0.47). Definite or probable stent thrombosis was significantly reduced among patients treated with BP-DES (hazard ratio=0.52, 95% CI=0.28-0.96, P=0.04), a difference driven by significantly lower stent thrombosis rates with BP-DES between 1 and 4 years (hazard ratio=0.15, 95% CI=0.03-0.70, P=0.02). CONCLUSIONS In patients with diabetes, biodegradable polymer DES were associated with overall clinical outcomes comparable to durable polymer SES during follow-up to 4 years. Rates of stent thrombosis were significantly lower with BP-DES.