954 results for MDT 24 months
Abstract:
OBJECTIVES Pre-antiretroviral therapy (ART) inflammation and coagulation activation predict clinical outcomes in HIV-positive individuals. We assessed whether pre-ART inflammatory marker levels predicted the CD4 count response to ART. METHODS Analyses were based on data from the Strategic Management of Antiretroviral Therapy (SMART) trial, an international trial evaluating continuous vs. interrupted ART, and the Flexible Initial Retrovirus Suppressive Therapies (FIRST) trial, evaluating three first-line ART regimens with at least two drug classes. For this analysis, participants had to be ART-naïve or off ART at randomization and (re)starting ART, and to have C-reactive protein (CRP), interleukin-6 (IL-6) and D-dimer measured pre-ART. Using random effects linear models, we assessed the association between each of the biomarker levels, categorized as quartiles, and change in CD4 count from ART initiation to 24 months post-ART. Analyses adjusted for CD4 count at ART initiation (baseline), study arm, follow-up time and other known confounders. RESULTS Overall, 1084 individuals [659 from SMART (26% ART-naïve) and 425 from FIRST] met the eligibility criteria, providing 8264 CD4 count measurements. Seventy-five per cent of individuals were male, with a mean age of 42 years. The median (interquartile range) baseline CD4 counts were 416 (350-530) and 100 (22-300) cells/μL in SMART and FIRST, respectively. All of the biomarkers were inversely associated with baseline CD4 count in FIRST but not in SMART. In adjusted models, there was no clear relationship between increasing biomarker levels and mean change in CD4 count post-ART (P for trend: CRP, P = 0.97; IL-6, P = 0.25; D-dimer, P = 0.29). CONCLUSIONS Pre-ART inflammation and coagulation activation do not predict the CD4 count response to ART, and appear to influence the risk of clinical outcomes through mechanisms other than blunting of long-term CD4 count gain.
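The quartile categorization of pre-ART biomarker levels described above can be sketched as follows. This is only a minimal stand-alone illustration using Python's standard library, not the trial's analysis code (which modelled the resulting quartile categories in random effects linear models); the function name is hypothetical.

```python
from statistics import quantiles
from bisect import bisect_right

def quartile_labels(values):
    """Assign each biomarker measurement (e.g. CRP, IL-6, D-dimer) to its
    quartile (1-4) within the sample distribution."""
    cuts = quantiles(values, n=4)  # three cut points: Q1, median, Q3
    return [bisect_right(cuts, v) + 1 for v in values]
```

For example, `quartile_labels([1, 2, 3, 4, 5, 6, 7, 8])` splits the eight measurements evenly into four groups of two.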
Abstract:
OBJECTIVE
To evaluate the long-term oncological and functional outcomes after readaptation of the dorsolateral peritoneal layer following pelvic lymph node dissection (PLND) and cystectomy.
PATIENTS AND METHODS
A randomised, single-center, single-blinded, two-arm trial was conducted in 200 consecutive patients who underwent PLND and cystectomy for bladder cancer.
Abstract:
Denosumab reduced the incidence of new fractures in postmenopausal women with osteoporosis by 68% at the spine and 40% at the hip over 36 months compared with placebo in the FREEDOM study. This efficacy was supported by improvements from baseline at 36 months of 18.2% in vertebral strength in axial compression and 8.6% in femoral strength in a sideways fall configuration, estimated in newtons by an established voxel-based finite element (FE) methodology. Since FE analyses rely on the choice of meshes, material properties, and boundary conditions, the aim of this study was to independently confirm and compare the effects of denosumab on vertebral and femoral strength during the FREEDOM trial using an alternative smooth FE methodology. Unlike the previous FE study, effects on femoral strength in a physiological stance configuration were also examined. QCT data for the proximal femur and two lumbar vertebrae were analyzed by smooth FE methodology at baseline, 12, 24, and 36 months for 51 treated (denosumab) and 47 control (placebo) subjects. QCT images were segmented and converted into smooth FE models to compute bone strength. L1 and L2 vertebral bodies were virtually loaded in axial compression and the proximal femora in both fall and stance configurations. Denosumab increased vertebral body strength by 10.8%, 14.0%, and 17.4% from baseline at 12, 24, and 36 months, respectively (p < 0.0001). Denosumab also increased femoral strength in the fall configuration by 4.3%, 5.1%, and 7.2% from baseline at 12, 24, and 36 months, respectively (p < 0.0001). Similar improvements were observed in the stance configuration, with increases of 4.2%, 5.2%, and 5.2% from baseline (p ≤ 0.0007). Differences between the increasing strengths with denosumab and the decreasing strengths with placebo were significant starting at 12 months (vertebral and femoral fall) or 24 months (femoral stance).
Using an alternative smooth FE methodology, we confirmed the significant improvements in vertebral body and proximal femur strength previously observed with denosumab. Estimated increases in strength with denosumab and decreases with placebo were highly consistent between both FE techniques.
Abstract:
AIMS The GLOBAL LEADERS trial is a superiority study in patients undergoing percutaneous coronary intervention, with uniform use of Biolimus A9-eluting stents (BES) and bivalirudin. GLOBAL LEADERS was designed to assess whether a 24-month antithrombotic regimen of ticagrelor plus one month of acetylsalicylic acid (ASA), compared with conventional dual antiplatelet therapy (DAPT), improves outcomes. METHODS AND RESULTS Patients (n > 16,000) are randomised (1:1) to ticagrelor 90 mg twice daily for 24 months plus ASA ≤100 mg for one month, versus DAPT with either ticagrelor (acute coronary syndrome) or clopidogrel (stable coronary artery disease) for 12 months plus ASA ≤100 mg for 24 months. The primary outcome is a composite of all-cause mortality or non-fatal, new Q-wave myocardial infarction at 24 months. The key safety endpoint is investigator-reported type 3 or 5 bleeding according to the Bleeding Academic Research Consortium (BARC) definitions. Sensitivity analyses will be carried out to explore potential differences in outcome across geographic regions and according to specific angiographic and clinical risk estimates. CONCLUSIONS The GLOBAL LEADERS trial aims to assess the role of ticagrelor as a single antiplatelet agent after a short course of DAPT for the long-term prevention of cardiac adverse events, across a wide spectrum of patients, following BES implantation.
Abstract:
BACKGROUND Buruli ulcer (BU) is a necrotizing skin disease most prevalent among West African children. The causative organism, Mycobacterium ulcerans, is sensitive to temperatures above 37°C. We investigated the safety and efficacy of a local heat application device based on phase change material. METHODS In a phase II open-label single-center noncomparative clinical trial (ISRCTN 72102977) under GCP standards in Cameroon, laboratory-confirmed BU patients received up to 8 weeks of heat treatment. We assessed efficacy based on the endpoints 'absence of clinical BU-specific features' or 'wound closure' within 6 months ("primary cure"), and 'absence of clinical recurrence within 24 months' ("definite cure"). RESULTS Of 53 patients, 51 (96%) had ulcerative disease. 62% were classified as World Health Organization category II, and 19% each as categories I and III. The average lesion size was 45 cm². Within 6 months after completion of heat treatment, 92.4% (49 of 53; 95% confidence interval [CI], 81.8% to 98.0%) achieved cure of their primary lesion. At 24 months of follow-up, 83.7% (41 of 49; 95% CI, 70.3% to 92.7%) of patients with primary cure remained free of recurrence. Heat treatment was well tolerated; adverse effects were occasional mild local skin reactions. CONCLUSIONS Local thermotherapy is a highly effective, simple, cheap and safe treatment for M. ulcerans disease. In particular, it has potential as a home-based remedy for BU-suspicious lesions at the community level, where laboratory confirmation is not available. CLINICAL TRIALS REGISTRATION ISRCTN 72102977.
Abstract:
The updated Vienna Prediction Model for estimating recurrence risk after an unprovoked venous thromboembolism (VTE) has been developed to identify individuals at low risk for VTE recurrence in whom anticoagulation (AC) therapy may be stopped after 3 months. We externally validated the accuracy of the model to predict recurrent VTE in a prospective multicenter cohort of 156 patients aged ≥65 years with acute symptomatic unprovoked VTE who had received 3 to 12 months of AC. Patients with a predicted 12-month risk within the lowest quartile based on the updated Vienna Prediction Model were classified as low risk. The risk of recurrent VTE did not differ between low- vs higher-risk patients at 12 months (13% vs 10%; P = .77) and 24 months (15% vs 17%; P = 1.0). The area under the receiver operating characteristic curve for predicting VTE recurrence was 0.39 (95% confidence interval [CI], 0.25-0.52) at 12 months and 0.43 (95% CI, 0.31-0.54) at 24 months. In conclusion, in elderly patients with unprovoked VTE who have stopped AC, the updated Vienna Prediction Model does not discriminate between patients who develop recurrent VTE and those who do not. This study was registered at www.clinicaltrials.gov as #NCT00973596.
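The area under the ROC curve reported above has a simple rank interpretation: the probability that a patient who went on to recur was assigned a higher predicted risk than one who did not, so a value near 0.5 means no discrimination, and values below 0.5 (as observed here) mean the ranking is worse than chance. A minimal sketch of this concordance calculation, with a hypothetical helper name and no connection to the study's actual code:

```python
from itertools import product

def auc(recurrent_scores, nonrecurrent_scores):
    """Concordance (area under the ROC curve): fraction of (recurrent,
    non-recurrent) patient pairs in which the recurrent patient received
    the higher predicted risk; ties count as half a win."""
    pairs = list(product(recurrent_scores, nonrecurrent_scores))
    wins = sum(1.0 if r > n else 0.5 if r == n else 0.0 for r, n in pairs)
    return wins / len(pairs)
```

With perfectly separated scores the function returns 1.0; with perfectly inverted scores, 0.0.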
Abstract:
BACKGROUND Viral load and CD4% are often not available in resource-limited settings for monitoring children's responses to antiretroviral therapy (ART). We aimed to construct normative curves for weight gain at 6, 12, 18, and 24 months following initiation of ART in children, and to assess the association between poor weight gain and subsequent responses to ART. DESIGN Analysis of data from HIV-infected children younger than 10 years old from African and Asian clinics participating in the International epidemiologic Databases to Evaluate AIDS. METHODS The generalized additive model for location, scale, and shape was used to construct normative percentile curves for weight gain at 6, 12, 18, and 24 months following ART initiation. Cox proportional hazards models were used to assess the association between lower percentiles (<50th) of the weight gain distribution at the different time points and subsequent death, virological suppression, and virological failure. RESULTS Among 7173 children from five regions of the world, 45% were underweight at baseline. Weight gain below the 50th percentile at 6, 12, 18, and 24 months of ART was associated with an increased risk of death, independent of baseline characteristics. Poor weight gain was not associated with increased hazards of virological suppression or virological failure. CONCLUSION Monitoring weight gain on ART using age- and sex-specific normative curves specifically developed for HIV-infected children on ART is a simple, rapid, sustainable tool that can aid in the identification of children who are at increased risk of death in the first year of ART.
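The classification step above, placing a child's observed weight gain on a normative distribution and flagging values below the 50th percentile, can be illustrated with a crude empirical-percentile helper. The study itself fitted smoothed GAMLSS percentile curves by age and sex; this hypothetical function only sketches the idea against a raw reference sample.

```python
def weight_gain_percentile(child_gain, reference_gains):
    """Empirical percentile of a child's weight gain within a normative
    reference sample (mid-rank convention for ties). A crude stand-in for
    smoothed, age- and sex-specific percentile curves."""
    below = sum(g < child_gain for g in reference_gains)
    ties = sum(g == child_gain for g in reference_gains)
    return 100.0 * (below + 0.5 * ties) / len(reference_gains)
```

A child whose gain falls below the 50th percentile (`weight_gain_percentile(...) < 50`) would be flagged for closer follow-up under the scheme described.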
Abstract:
OBJECTIVE Vertebroplasty and balloon kyphoplasty are effective treatment options for osteoporotic vertebral compression fractures but are limited in their correction of kyphotic deformity. Lordoplasty has been reported as an alternative, cost-effective, minimally invasive, percutaneous cement augmentation technique with good restoration of vertebral body height and alignment. The authors report on its clinical and radiological midterm results. METHODS A retrospective review was conducted of patients treated with lordoplasty from 2002 to 2014. Inclusion criteria were clinical and radiological follow-up evaluations longer than 24 months. Radiographs were assessed for initial correction and progressive loss of reduction. Complications and reoperations were recorded. Current pain level, pain relief immediately after surgery, autonomy, and subjective impression of improvement in posture were assessed by questionnaire. RESULTS Sixty-five patients (46 women, 19 men; age range 38.9-86.2 years) were treated with lordoplasty for 69 vertebral compression and insufficiency fractures. A significant correction of the vertebral kyphotic angle (mean 13°) and segmental kyphotic angle (mean 11°) over a mean follow-up of 33 months (range 24-108 months) was achieved (p < 0.001). On average, pain was relieved to 90% of the initial pain level. In 24% of the 65 patients a second spinal intervention was necessary: 16 distant (24.6%) and 7 adjacent (10.8%) new osteoporotic fractures, 4 instrumented stabilizations (6.2%), 1 new adjacent traumatic fracture (1.5%), and 1 distant microsurgical decompression (1.5%). Cement leakage occurred in 10.4% but was symptomatic in only 1 case. CONCLUSIONS Lordoplasty appeared safe and effective for midterm pain alleviation and restoration of kyphotic deformity in osteoporotic compression and insufficiency fractures. The outcomes of lordoplasty are consistent with those of other augmentation techniques.
Abstract:
OBJECTIVE In Europe, growth hormone (GH) treatment for children born small for gestational age (SGA) can only be initiated after 4 years of age. However, younger age at treatment initiation is a predictor of a favourable response. Our aim was to assess the effect of GH treatment on early growth and cognitive functioning in very young (<30 months), short-stature children born SGA. DESIGN A 2-year, randomized controlled, multicentre study (NCT00627523; EGN study), in which patients received either GH treatment or no treatment for 24 months. PATIENTS Children aged 19-29 months diagnosed as SGA at birth, and for whom sufficient early growth data were available, were eligible. Patients were randomized (1:1) to GH treatment (Genotropin®, Pfizer Inc.) at a dose of 0·035 mg/kg/day by subcutaneous injection, or no treatment. MEASUREMENTS The primary objective was to assess the change from baseline in height standard deviation score (SDS) after 24 months of GH treatment. RESULTS Change from baseline in height SDS was significantly greater in the GH treatment vs control group at both month 12 (1·03 vs 0·14) and month 24 (1·63 vs 0·43; both P < 0·001). Growth velocity SDS was significantly higher in the GH treatment vs control group at 12 months (P < 0·001), but not at 24 months. There was no significant difference in mental or psychomotor development indices between the two groups. CONCLUSIONS GH treatment for 24 months in very young short-stature children born SGA resulted in a significant increase in height SDS compared with no treatment.
Abstract:
Background Studies of Malawi's option B+ programme for HIV-positive pregnant and breastfeeding women have reported high loss to follow-up during pregnancy and at the start of antiretroviral therapy (ART), but few data exist about retention during breastfeeding and after weaning. We examined loss to follow-up and retention in care in patients in the option B+ programme during their first 3 years on ART. Methods We analysed two data sources: aggregated facility-level data about patients in option B+ who started ART between Oct 1, 2011, and June 30, 2012, at 546 health facilities; and patient-level data from 20 large facilities with electronic medical record systems for HIV-positive women who started ART between Sept 1, 2011, and Dec 31, 2013, under option B+ or because they had WHO clinical stage 3 or 4 disease or had CD4 counts of less than 350 cells per μL. We used facility-level data to calculate representative estimates of retention and loss to follow-up. We used patient-level data to study temporal trends in retention, timing of loss to follow-up, and predictors of no follow-up and loss to follow-up. We defined patients who were more than 60 days late for their first follow-up visit as having no follow-up and patients who were more than 60 days late for a subsequent visit as being lost to follow-up. We calculated proportions and cumulative probabilities of patients who had died, stopped ART, had no follow-up, were lost to follow-up, or were retained alive on ART for 36 months. We calculated odds ratios and hazard ratios to examine predictors of no follow-up and loss to follow-up. Findings Analysis of facility-level data about patients in option B+ who had not transferred to a different facility showed retention in care to be 76·8% (20 475 of 26 658 patients) after 12 months, 70·8% (18 306 of 25 849 patients) after 24 months, and 69·7% (17 787 of 25 535 patients) after 36 months. Patient-level data included 29 145 patients.
14 630 (50·2%) began treatment under option B+. Patients in option B+ had a higher risk of having no follow-up and, for the first 2 years of ART, higher risk of loss to follow-up than did patients who started ART because they had CD4 counts less than 350 cells per μL or WHO clinical stage 3 or 4 disease. Risk of loss to follow-up during the third year was low and similar for patients retained for 2 years. Retention rates did not change as the option B+ programme matured. Interpretation Our data suggest that pregnant and breastfeeding women who start ART immediately after they are diagnosed with HIV can be retained on ART through the option B+ programme, even after many have stopped breastfeeding. Interventions might be needed to improve retention in the first year on ART in option B+. Funding Bill & Melinda Gates Foundation, Partnerships for Enhanced Engagement in Research Health, and National Institute of Allergy and Infectious Diseases.
Abstract:
INTRODUCTION: To describe the clinical outcomes of intravitreal ranibizumab treatment for subfoveal choroidal neovascularisation (CNV) associated with Stargardt disease. METHODS: Prospective, interventional case series. All patients underwent intravitreal ranibizumab injections following a pro re nata regimen with monthly examinations, over a 24-month follow-up. RESULTS: Three eyes were included in the study. Best corrected visual acuity changed from 0.47±0.06 LogMAR (mean±SD) at baseline to 0.90±0.17 LogMAR at the end of the 24-month follow-up. Overall, a mean of 11 ranibizumab injections were administered over the 24 months. Significant atrophic growth was detected in all cases, with the mean atrophy area increasing from 2.34±2.60 mm² (mean±SD) at baseline to 4.23±3.31 mm² at the end of the follow-up. CONCLUSIONS: Ranibizumab treatment can stop CNV progression but cannot ensure a significant visual improvement. Macular atrophy tends to enlarge significantly under ranibizumab treatment over the follow-up.
Abstract:
STUDY DESIGN Single-centre retrospective study of prospectively collected data, nested within the Eurospine Spine Tango data acquisition system. OBJECTIVE The aim of this study was to assess the patient-rated outcome and complication rates associated with lumbar fusion procedures in three different age groups. SUMMARY OF BACKGROUND DATA There is a general reluctance to consider spinal fusion procedures in elderly patients because of the increased likelihood of complications. METHODS Before and at 3, 12, and 24 months after surgery, patients completed the multidimensional Core Outcome Measures Index (COMI). At the 3-, 12-, and 24-month follow-ups they also rated the Global Treatment Outcome (GTO) and their satisfaction with care. Patients were divided into three age groups: younger (≥50 to <65 years; n = 317), older (≥65 to <80 years; n = 350), and geriatric (≥80 years; n = 40). RESULTS 707 consecutive patients were included. The preoperative comorbidity status differed significantly (p < 0.0001) between the age groups, with the highest scores in the geriatric group. Medical complications during surgery were less frequent in the younger age group (7%) than in the older (13.4%; p = 0.006) and geriatric groups (17.5%; p = 0.007); surgical complications tended to be more frequent in the geriatric group (younger, 6.3%; older, 6.0%; geriatric, 15.0%; p = 0.09). There were no significant group differences (p > 0.05) in the scores for any of the COMI domains, GTO, or patient-rated satisfaction at the 3-, 12-, or 24-month follow-ups. CONCLUSIONS Despite greater comorbidity and complication rates in geriatric patients, the patient-rated outcome was as good in the elderly as in the younger age groups up to two years after surgery. These data indicate that geriatric age warrants careful consideration of the associated risks but is not per se a contraindication to fusion for lumbar degenerative disease. LEVEL OF EVIDENCE 4.
Abstract:
BACKGROUND In recent years, the scientific discussion has focused on new strategies to enable a torn anterior cruciate ligament (ACL) to heal into mechanically stable scar tissue. Dynamic intraligamentary stabilization (DIS) was first performed in a pilot study of 10 patients. The purpose of the current study was to evaluate whether DIS would lead to similarly sufficient stability and good clinical function in a larger case series. METHODS Acute ACL ruptures were treated using an internal stabilizer, combined with anatomical repositioning of the torn bundles and microfracturing to promote self-healing. Clinical assessment (Tegner, Lysholm, IKDC, and visual analogue scale [VAS] patient satisfaction scores) and assessment of knee laxity were performed at 3, 6, 12, and 24 months. A one-sample design with a non-inferiority margin was chosen to compare the preoperative and postoperative IKDC and Lysholm scores. RESULTS 278 patients with a 6:4 male-to-female ratio were included. The average patient age was 31 years. Preoperative mean IKDC, Lysholm, and Tegner scores were 98.8, 99.3, and 5.1 points, respectively. The mean anteroposterior (AP) translation difference from the healthy contralateral knee was 4.7 mm preoperatively. After DIS treatment, the mean 12-month IKDC, Lysholm, and Tegner scores were 93.6, 96.2, and 4.9 points, respectively, and the mean AP translation difference was 2.3 mm. All these outcomes were significantly non-inferior to the preoperative or healthy contralateral values (p < 0.0001). Mean patient satisfaction was 8.8 (VAS 0-10). Eight ACL reruptures occurred, and 3 patients reported insufficient subjective stability of the knee at the end of the study period. CONCLUSIONS Anatomical repositioning, along with DIS and microfracturing, leads to clinically stable healing of the torn ACL in the large majority of patients.
Most patients exhibited almost normal knee function, reported excellent satisfaction, and were able to return to their previous levels of sporting activity. Moreover, this strategy resulted in stable healing of all sutured menisci, which could lower the rate of osteoarthritic changes in the future. The present findings support the discussion of a new paradigm in ACL treatment based on preservation and self-healing of the torn ligament.
Abstract:
PURPOSE To compare patient outcomes and complication rates after different decompression techniques or instrumented fusion (IF) in lumbar spinal stenosis (LSS). METHODS The multicentre study was based on Spine Tango data. Inclusion criteria were LSS with a posterior decompression and pre- and postoperative COMI assessment between 3 and 24 months. 1,176 cases were assigned to four groups: (1) laminotomy (n = 642), (2) hemilaminectomy (n = 196), (3) laminectomy (n = 230) and (4) laminectomy combined with an IF (n = 108). Clinical outcomes were achievement of minimum relevant change in COMI back and leg pain and COMI score (2.2 points), surgical and general complications, measures taken due to complications, and reintervention on the index level based on patient information. The inverse propensity score weighting method was used for adjustment. RESULTS Laminotomy, hemilaminectomy and laminectomy were significantly less beneficial than laminectomy in combination with IF regarding leg pain (ORs with 95% CI 0.52, 0.34-0.81; 0.25, 0.15-0.41; 0.44, 0.27-0.72, respectively) and COMI score improvement (ORs with 95% CI 0.51, 0.33-0.81; 0.30, 0.18-0.51; 0.48, 0.29-0.79, respectively). However, the sole decompressions caused significantly fewer surgical (ORs with 95% CI 0.42, 0.26-0.69; 0.33, 0.17-0.63; 0.39, 0.21-0.71, respectively) and general complications (ORs with 95% CI 0.11, 0.04-0.29; 0.03, 0.003-0.41; 0.25, 0.09-0.71, respectively) than laminectomy in combination with IF. Accordingly, the likelihood of required measures was also significantly lower after laminotomy (OR 0.28, 95% CI 0.17-0.46), hemilaminectomy (OR 0.28, 95% CI 0.15-0.53) and after laminectomy (OR 0.39, 95% CI 0.22-0.68) in comparison with laminectomy with IF. The likelihood of a reintervention was not significantly different between the treatment groups. DISCUSSION As already demonstrated in the literature, decompression in patients with LSS is a very effective treatment. 
Despite better patient outcomes after laminectomy in combination with IF, caution is advised because of the higher rates of surgical and general complications and the measures consequently required. Based on the current study, laminotomy or laminectomy, rather than hemilaminectomy, is recommended for achieving minimum relevant pain relief.
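The inverse propensity score weighting used for adjustment in the registry comparison above can be sketched as follows. Here the propensity of receiving the treatment is estimated naively as the treated fraction within each stratum, whereas the study would have used a richer propensity model, so this hypothetical helper illustrates only the weighting step (and assumes overlap, i.e. each stratum contains both treated and untreated cases):

```python
def ipw_means(records):
    """Inverse-propensity-weighted mean outcome per arm (0 = control,
    1 = treated). Propensity = treated fraction within each stratum."""
    # Estimate the propensity score per stratum.
    strata = {}
    for r in records:
        counts = strata.setdefault(r["stratum"], [0, 0])
        counts[0] += r["treated"]
        counts[1] += 1
    prop = {k: t / n for k, (t, n) in strata.items()}
    # Horvitz-Thompson weighting: treated get 1/p, controls 1/(1-p).
    sums = {0: [0.0, 0.0], 1: [0.0, 0.0]}  # arm -> [weighted outcome, total weight]
    for r in records:
        p = prop[r["stratum"]]
        w = 1.0 / p if r["treated"] else 1.0 / (1.0 - p)
        sums[r["treated"]][0] += w * r["outcome"]
        sums[r["treated"]][1] += w
    return {arm: s / w for arm, (s, w) in sums.items()}
```

On synthetic data where treatment assignment is confounded by stratum, the naive arm means can be equal while the weighted means recover the underlying treatment effect.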
Abstract:
Abstract. We resumed mowing in two plots of ca. 100 m² in an abandoned meadow dominated by Brachypodium pinnatum on the slope of Monte Generoso (Switzerland). We monitored species composition and hay yield using point quadrats and biomass samples. Species frequencies changed little during 10 yr (1988–1997), while hay yields fluctuated widely with mean relative humidity in April-June. We performed a seed-addition experiment to test whether the establishment of meadow species is limited by a lack of diaspores or of favourable microsites for germination and recruitment from the seed bank. We sowed ca. 12 000 seeds of 12 species originating from a nearby meadow individually in plots of a 4 × 6 unbalanced Latin square with four treatments: burning, mowing, mowing with removal of a layer of decayed organic matter, and a control. We monitored the fate of individual seedlings for 24 months. Seedlings of all species established and survived for 12 months, 10 species survived for at least 24 months, and some reached a reproductive stage. Species responded to the different qualities of microsites provided by the different treatments and thus required different regeneration niches. Spontaneous long-distance immigration was insignificant. We conclude that the former species composition of abandoned meadows cannot easily be restored by mowing alone, because many meadow plant species do not have persistent seed banks, and immigration over distances of more than 25 m followed by successful establishment is very unlikely.