89 results for MDT 24 months


Relevance:

80.00%

Publisher:

Abstract:

Background Studies of Malawi's option B+ programme for HIV-positive pregnant and breastfeeding women have reported high loss to follow-up during pregnancy and at the start of antiretroviral therapy (ART), but few data exist about retention during breastfeeding and after weaning. We examined loss to follow-up and retention in care in patients in the option B+ programme during their first 3 years on ART. Methods We analysed two data sources: aggregated facility-level data about patients in option B+ who started ART between Oct 1, 2011, and June 30, 2012, at 546 health facilities; and patient-level data from 20 large facilities with electronic medical record systems for HIV-positive women who started ART between Sept 1, 2011, and Dec 31, 2013, under option B+ or because they had WHO clinical stage 3 or 4 disease or CD4 counts of less than 350 cells per μL. We used facility-level data to calculate representative estimates of retention and loss to follow-up. We used patient-level data to study temporal trends in retention, timing of loss to follow-up, and predictors of no follow-up and loss to follow-up. We defined patients who were more than 60 days late for their first follow-up visit as having no follow-up and patients who were more than 60 days late for a subsequent visit as being lost to follow-up. We calculated proportions and cumulative probabilities of patients who had died, stopped ART, had no follow-up, were lost to follow-up, or were retained alive on ART for 36 months. We calculated odds ratios and hazard ratios to examine predictors of no follow-up and loss to follow-up. Findings Analysis of facility-level data about patients in option B+ who had not transferred to a different facility showed retention in care to be 76·8% (20 475 of 26 658 patients) after 12 months, 70·8% (18 306 of 25 849 patients) after 24 months, and 69·7% (17 787 of 25 535 patients) after 36 months. Patient-level data included 29 145 patients, of whom 14 630 (50·2%) began treatment under option B+. Patients in option B+ had a higher risk of having no follow-up and, for the first 2 years of ART, a higher risk of loss to follow-up than did patients who started ART because they had CD4 counts less than 350 cells per μL or WHO clinical stage 3 or 4 disease. Risk of loss to follow-up during the third year was low and similar for patients retained for 2 years. Retention rates did not change as the option B+ programme matured. Interpretation Our data suggest that pregnant and breastfeeding women who start ART immediately after they are diagnosed with HIV can be retained on ART through the option B+ programme, even after many have stopped breastfeeding. Interventions might be needed to improve retention in the first year on ART in option B+. Funding Bill & Melinda Gates Foundation, Partnerships for Enhanced Engagement in Research Health, and National Institute of Allergy and Infectious Diseases.
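As a rough illustration of the retention analysis described above (a sketch only, not the authors' published code), the snippet below flags "no follow-up" and loss to follow-up from visit lateness using the study's 60-day definition and estimates retention with a Kaplan-Meier curve from the lifelines library. All column names and values are hypothetical.

import pandas as pd
from lifelines import KaplanMeierFitter

# Hypothetical patient-level records; all columns are assumptions for illustration.
visits = pd.DataFrame({
    "patient_id":            [1, 2, 3, 4],
    "days_late_first_visit": [10, 75, 0, 200],  # days past the scheduled first follow-up visit
    "months_observed":       [36, 2, 24, 1],    # months from ART start to last visit or censoring
    "event_ltfu":            [0, 1, 0, 1],      # 1 = more than 60 days late for a subsequent visit
})

# "No follow-up": more than 60 days late for the first follow-up visit
visits["no_follow_up"] = visits["days_late_first_visit"] > 60

# Kaplan-Meier estimate of retention (the complement of cumulative loss to follow-up)
kmf = KaplanMeierFitter()
kmf.fit(durations=visits["months_observed"], event_observed=visits["event_ltfu"])
print(kmf.survival_function_at_times([12, 24, 36]))  # retention at 12, 24 and 36 months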

Relevance:

80.00%

Publisher:

Abstract:

INTRODUCTION: To describe the clinical outcomes of intravitreal ranibizumab treatment for subfoveal choroidal neovascularisation (CNV) associated with Stargardt disease. METHODS: Prospective, interventional case series. All patients underwent intravitreal ranibizumab injections following a pro re nata regimen with monthly examination, over a 24-month follow-up. RESULTS: Three eyes were included in the study. Best corrected visual acuity changed from 0.47±0.06 LogMAR (mean±SD) at baseline to 0.90±0.17 LogMAR at the end of the 24-month follow-up. Overall, a mean of 11 ranibizumab injections was administered over the 24 months. Significant atrophic growth was detected in all cases, with the mean atrophy area increasing from 2.34±2.60 mm² (mean±SD) at baseline to 4.23±3.31 mm² at the end of the follow-up. CONCLUSIONS: Ranibizumab treatment can halt CNV progression but cannot ensure a significant visual improvement. Macular atrophy tends to enlarge significantly under ranibizumab treatment over the follow-up.

Relevance:

80.00%

Publisher:

Abstract:

STUDY DESIGN Single-centre retrospective study of prospectively collected data, nested within the Eurospine Spine Tango data acquisition system. OBJECTIVE The aim of this study was to assess the patient-rated outcome and complication rates associated with lumbar fusion procedures in three different age groups. SUMMARY OF BACKGROUND DATA There is a general reluctance to consider spinal fusion procedures in elderly patients due to the increased likelihood of complications. METHODS Before and at 3, 12, and 24 months after surgery, patients completed the multidimensional Core Outcome Measures Index (COMI). At the 3-, 12-, and 24-month follow-ups they also rated the Global Treatment Outcome (GTO) and their satisfaction with care. Patients were divided into three age groups: younger (≥50 and <65 years; n = 317), older (≥65 and <80 years; n = 350), and geriatric (≥80 years; n = 40). RESULTS 707 consecutive patients were included. The preoperative comorbidity status differed significantly (p < 0.0001) between the age groups, with the highest scores in the geriatric group. The rate of medical complications during surgery was lower in the younger age group (7%) than in the older (13.4%; p = 0.006) and geriatric groups (17.5%; p = 0.007); surgical complications tended to be more frequent in the geriatric group (younger, 6.3%; older, 6.0%; geriatric, 15.0%; p = 0.09). There were no significant group differences (p > 0.05) in the scores on any of the COMI domains, GTO, or patient-rated satisfaction at the 3-, 12-, or 24-month follow-up. CONCLUSIONS Despite greater comorbidity and complication rates in geriatric patients, the patient-rated outcome was as good in the elderly as in the younger age groups up to two years after surgery. These data indicate that, although geriatric age requires careful consideration of the associated risks, it is not per se a contraindication to fusion for lumbar degenerative disease. LEVEL OF EVIDENCE 4.

Relevance:

80.00%

Publisher:

Abstract:

BACKGROUND In recent years, the scientific discussion has focused on new strategies to enable a torn anterior cruciate ligament (ACL) to heal into mechanically stable scar tissue. Dynamic intraligamentary stabilization (DIS) was first performed in a pilot study of 10 patients. The purpose of the current study was to evaluate whether DIS would lead to similarly sufficient stability and good clinical function in a larger case series. METHODS Acute ACL ruptures were treated by using an internal stabilizer, combined with anatomical repositioning of the torn bundles and microfracturing to promote self-healing. Clinical assessment (Tegner, Lysholm, IKDC, and visual analogue scale [VAS] for patient satisfaction scores) and assessment of knee laxity were performed at 3, 6, 12, and 24 months. A one-sample design with a non-inferiority margin was chosen to compare the preoperative and postoperative IKDC and Lysholm scores. RESULTS 278 patients with a 6:4 male-to-female ratio were included. Average patient age was 31 years. Preoperative mean IKDC, Lysholm, and Tegner scores were 98.8, 99.3, and 5.1 points, respectively. The mean anteroposterior (AP) translation difference from the healthy contralateral knee was 4.7 mm preoperatively. After DIS treatment, the mean 12-month IKDC, Lysholm, and Tegner scores were 93.6, 96.2, and 4.9 points, respectively, and the mean AP translation difference was 2.3 mm. All these outcomes were significantly non-inferior to the preoperative or healthy contralateral values (p < 0.0001). Mean patient satisfaction was 8.8 (VAS 0-10). Eight ACL reruptures occurred, and 3 patients reported insufficient subjective stability of the knee at the end of the study period. CONCLUSIONS Anatomical repositioning, along with DIS and microfracturing, leads to clinically stable healing of the torn ACL in the large majority of patients. Most patients exhibited almost normal knee function, reported excellent satisfaction, and were able to return to their previous levels of sporting activity. Moreover, this strategy resulted in stable healing of all sutured menisci, which could lower the rate of osteoarthritic changes in the future. The present findings support the discussion of a new paradigm in ACL treatment based on preservation and self-healing of the torn ligament.
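The "one-sample design with a non-inferiority margin" mentioned in the methods can be sketched as follows. This is an illustration only: the abstract does not report the margin used, so the 5-point margin and the score values below are invented assumptions.

import numpy as np
from scipy import stats

pre_mean = 98.8                # reference value taken from the abstract (preoperative mean IKDC)
margin = 5.0                   # hypothetical non-inferiority margin in IKDC points (not reported)
post_ikdc = np.array([95, 92, 97, 90, 94, 96, 91, 93])  # hypothetical 12-month scores

# H0: postoperative mean <= reference - margin; H1: postoperative mean > reference - margin
t_stat, p_value = stats.ttest_1samp(post_ikdc, popmean=pre_mean - margin,
                                    alternative="greater")  # one-sided test, SciPy >= 1.6
print(f"t = {t_stat:.2f}, one-sided p = {p_value:.4f}")
# A small p-value supports non-inferiority within the chosen margin.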

Relevance:

80.00%

Publisher:

Abstract:

PURPOSE To compare patient outcomes and complication rates after different decompression techniques or instrumented fusion (IF) in lumbar spinal stenosis (LSS). METHODS The multicentre study was based on Spine Tango data. Inclusion criteria were LSS treated with a posterior decompression and pre- and postoperative COMI assessments between 3 and 24 months. 1,176 cases were assigned to four groups: (1) laminotomy (n = 642), (2) hemilaminectomy (n = 196), (3) laminectomy (n = 230) and (4) laminectomy combined with an IF (n = 108). Clinical outcomes were achievement of the minimum relevant change in COMI back pain, leg pain and total score (2.2 points), surgical and general complications, measures taken due to complications, and reintervention at the index level based on patient information. The inverse propensity score weighting method was used for adjustment. RESULTS Laminotomy, hemilaminectomy and laminectomy were significantly less beneficial than laminectomy in combination with IF regarding leg pain (ORs with 95% CI 0.52, 0.34-0.81; 0.25, 0.15-0.41; 0.44, 0.27-0.72, respectively) and COMI score improvement (ORs with 95% CI 0.51, 0.33-0.81; 0.30, 0.18-0.51; 0.48, 0.29-0.79, respectively). However, the decompression-only procedures caused significantly fewer surgical (ORs with 95% CI 0.42, 0.26-0.69; 0.33, 0.17-0.63; 0.39, 0.21-0.71, respectively) and general complications (ORs with 95% CI 0.11, 0.04-0.29; 0.03, 0.003-0.41; 0.25, 0.09-0.71, respectively) than laminectomy in combination with IF. Accordingly, the likelihood of measures being required due to complications was also significantly lower after laminotomy (OR 0.28, 95% CI 0.17-0.46), hemilaminectomy (OR 0.28, 95% CI 0.15-0.53) and laminectomy (OR 0.39, 95% CI 0.22-0.68) in comparison with laminectomy with IF. The likelihood of a reintervention was not significantly different between the treatment groups. DISCUSSION As already demonstrated in the literature, decompression in patients with LSS is a very effective treatment. Despite better patient outcomes after laminectomy in combination with IF, caution is advised because of the higher rates of surgical and general complications and the measures consequently required. Based on the current study, laminotomy or laminectomy, rather than hemilaminectomy, is recommended for achieving minimum relevant pain relief.
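Inverse propensity score weighting, named in the methods as the adjustment technique, can be sketched in Python as below. This is a simplified, hypothetical illustration with synthetic data and a binary contrast (decompression alone versus laminectomy with IF), not the study's actual four-group analysis; the standard errors here are naive, whereas a full analysis would use robust variance estimation.

import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500
df = pd.DataFrame({
    "fusion":        rng.integers(0, 2, n),   # 1 = laminectomy with instrumented fusion (assumed coding)
    "age":           rng.normal(68, 9, n),
    "baseline_comi": rng.normal(7.5, 1.5, n),
})
# Hypothetical binary outcome: achieved the minimum relevant COMI improvement (2.2 points)
df["improved"] = rng.integers(0, 2, n)

# Step 1: propensity model, i.e. the probability of receiving fusion given the covariates
ps = smf.logit("fusion ~ age + baseline_comi", data=df).fit(disp=0).predict(df)

# Step 2: inverse probability weights (treated: 1/ps, untreated: 1/(1 - ps))
df["ipw"] = np.where(df["fusion"] == 1, 1 / ps, 1 / (1 - ps))

# Step 3: weighted outcome model; exp(coefficient) is the adjusted odds ratio
out = smf.glm("improved ~ fusion", data=df, family=sm.families.Binomial(),
              freq_weights=np.asarray(df["ipw"])).fit()
print(np.exp(out.params["fusion"]))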

Relevance:

80.00%

Publisher:

Abstract:

We resumed mowing in two plots of ca. 100 m² in an abandoned meadow dominated by Brachypodium pinnatum on the slope of Monte Generoso (Switzerland). We monitored species composition and hay yield using point quadrats and biomass samples. Species frequencies changed little during 10 yr (1988–1997), while hay yields showed large fluctuations according to the mean relative humidity in April-June. We performed a seed-addition experiment to test whether the establishment of meadow species is limited by a lack of diaspores or of favourable microsites for germination and recruitment from the seed bank. We sowed ca. 12 000 seeds of 12 species originating from a nearby meadow individually in plots of a 4 × 6 unbalanced Latin square with four treatments: burning, mowing, mowing with removal of a layer of decayed organic matter, and a control. We monitored the fate of seedling individuals for 24 months. Seedlings of all species established and survived for 12 months, 10 species survived for at least 24 months, and some reached a reproductive stage. Species responded to the different qualities of microsites provided by the different treatments and thus required different regeneration niches. Spontaneous long-distance immigration was insignificant. We conclude that the former species composition of abandoned meadows cannot easily be restored by mowing alone, because many meadow plant species do not have persistent seed banks and immigration over distances of more than 25 m followed by successful establishment is very unlikely.

Relevance:

80.00%

Publisher:

Abstract:

BACKGROUND Antiretroviral therapy (ART) initiation is now recommended irrespective of CD4 count. However, data on the relationship between CD4 count at ART initiation and loss to follow-up (LTFU) are limited and conflicting. METHODS We conducted a cohort analysis including all adults initiating ART (2008-2012) at three public sector sites in South Africa. LTFU was defined as no visit in the 6 months before database closure. The Kaplan-Meier estimator and Cox proportional hazards models were used to examine the relationship between CD4 count at ART initiation and 24-month LTFU. Final models were adjusted for demographics, year of ART initiation and programme expansion, and were corrected for unascertained mortality. RESULTS Among 17 038 patients, the median CD4 count at initiation increased from 119 cells/μL (IQR 54-180) in 2008 to 257 cells/μL (IQR 175-318) in 2012. In unadjusted models, observed LTFU was associated with both CD4 counts <100 cells/μL and CD4 counts ≥300 cells/μL. After adjustment, patients with CD4 counts ≥300 cells/μL were 1.35 (95% CI 1.12 to 1.63) times as likely to be LTFU after 24 months as those with a CD4 count of 150-199 cells/μL. This increased risk for patients with CD4 counts ≥300 cells/μL was largest in the first 3 months on treatment. Correction for unascertained deaths attenuated the association between CD4 counts <100 cells/μL and LTFU, whereas the association between CD4 counts ≥300 cells/μL and LTFU persisted. CONCLUSIONS Patients initiating ART at higher CD4 counts may be at increased risk of LTFU. With programmes initiating patients at higher CD4 counts, models of ART delivery need to be reoriented to support long-term retention.
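A minimal sketch of the kind of Cox proportional hazards model described, the hazard of 24-month LTFU by CD4 category at ART initiation, is given below using the lifelines library. The data are synthetic and the variable names are assumptions, not the cohort's dataset.

import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(1)
n = 300
df = pd.DataFrame({
    "cd4_ge_300": rng.integers(0, 2, n),    # CD4 >= 300 cells/uL vs. a lower reference category
    "age":        rng.normal(35, 9, n),
    "male":       rng.integers(0, 2, n),
})
# Synthetic follow-up times (months) with administrative censoring at 24 months
time_to_ltfu = rng.exponential(scale=40, size=n)
df["months"] = np.minimum(time_to_ltfu, 24)
df["ltfu"] = (time_to_ltfu < 24).astype(int)   # 1 = lost to follow-up, 0 = censored/retained

cph = CoxPHFitter()
cph.fit(df, duration_col="months", event_col="ltfu")
cph.print_summary()   # exp(coef) for cd4_ge_300 plays the role of the adjusted hazard ratio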

Relevance:

80.00%

Publisher:

Abstract:

PURPOSE To evaluate macular retinal ganglion cell layer thickness in patients with neovascular age-related macular degeneration (AMD) and intravitreal anti-vascular endothelial growth factor (VEGF) therapy. DESIGN Retrospective case series with fellow-eye comparison. METHODS Patients with continuous unilateral anti-VEGF treatment for sub- and juxtafoveal neovascular AMD and a minimum follow-up of 24 months were included. The retinal nerve fiber layer (RNFL) and retinal ganglion cell layer (RGCL) in the macula were segmented using an ETDRS grid. RNFL and RGCL thickness in the outer ring of the ETDRS grid were quantified at baseline and after repeated anti-VEGF injections, and compared with the patients' untreated fellow eye. Furthermore, best-corrected visual acuity (BCVA), age, and retinal pigment epithelium (RPE) atrophy were recorded and correlated with RNFL and RGCL thickness. RESULTS Sixty-eight eyes of 34 patients (23 female, 11 male; mean age 76.7 years, SD ±8.2) with a mean of 31.5 (SD ±9.8) anti-VEGF injections and a mean follow-up period of 45.3 months (SD ±10.5) were included. Whereas the RGCL thickness decreased significantly compared with the non-injected fellow eye (p=0.01), the decrease of the RNFL was not significant. Visual acuity gain was significantly correlated with RGCL thickness at follow-up (r=0.52, p<0.05) and negatively correlated with age (r=-0.41, p<0.05). Presence of RPE atrophy correlated negatively with the RGCL thickness at follow-up (r=-0.37, p=0.03). CONCLUSION During the course of long-term anti-VEGF therapy there is a significant decrease of the RGCL in patients with neovascular AMD compared with the fellow (untreated) eye.
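The fellow-eye comparison and the reported correlations correspond to a paired test and Pearson correlations; a minimal sketch with entirely hypothetical thickness and visual-acuity values is shown below (it is not the study's analysis code).

import numpy as np
from scipy import stats

# Hypothetical RGCL thickness (micrometres) at follow-up: treated eye vs. untreated fellow eye
rgcl_treated = np.array([30.1, 28.4, 31.2, 27.9, 29.5, 26.8])
rgcl_fellow  = np.array([33.0, 30.2, 32.8, 30.5, 31.1, 29.4])
t_stat, p_val = stats.ttest_rel(rgcl_treated, rgcl_fellow)   # paired comparison
print(f"paired t-test: t = {t_stat:.2f}, p = {p_val:.3f}")

# Hypothetical correlation between visual-acuity gain and RGCL thickness at follow-up
va_gain = np.array([5, 2, 8, 1, 4, 0])
r, p = stats.pearsonr(va_gain, rgcl_treated)
print(f"Pearson r = {r:.2f}, p = {p:.3f}")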

Relevance:

80.00%

Publisher:

Abstract:

AIM To compare the survival rates of Class II Atraumatic Restorative Treatment (ART) restorations placed in primary molars using cotton rolls or rubber dam as isolation methods. METHODS A total of 232 children, 6-7 years old, of both genders, were selected, each having one primary molar with a proximal dentine lesion. The children were randomly assigned to two groups: a control group with Class II ART restorations made using cotton rolls, and an experimental group using rubber dam. The restorations were evaluated by eight calibrated evaluators (Kappa > 0.8) after 6, 12, 18 and 24 months. RESULTS A total of 48 (20.7%) children were considered dropouts after 24 months. The cumulative survival rate after 6, 12, 18 and 24 months was 61.4%, 39.0%, 29.1% and 18.0%, respectively, for the control group, and 64.1%, 55.1%, 40.1% and 32.1%, respectively, for the rubber dam group. The log-rank test for censored data showed no statistically significant difference between the groups (P = 0.07). Univariate Cox regression showed no statistically significant difference after adjusting for independent variables (P > 0.05). CONCLUSION Both groups had similar survival rates, and after 2 years the use of rubber dam did not significantly increase the success of Class II ART restorations.
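The survival analysis described, Kaplan-Meier estimates compared with a log-rank test for censored data, can be sketched with the lifelines library as below; the follow-up times are synthetic and the group sizes and failure rates are assumptions, not the trial's data.

import numpy as np
import pandas as pd
from lifelines import KaplanMeierFitter
from lifelines.statistics import logrank_test

rng = np.random.default_rng(2)

def simulate_group(n, scale):
    # Synthetic months-to-failure for n restorations, censored at 24 months.
    t = np.minimum(rng.exponential(scale, n), 24)
    return pd.DataFrame({"months": t, "failed": (t < 24).astype(int)})

cotton = simulate_group(90, scale=10)   # hypothetical cotton-roll group
rubber = simulate_group(90, scale=14)   # hypothetical rubber-dam group

km = KaplanMeierFitter()
km.fit(rubber["months"], event_observed=rubber["failed"], label="rubber dam")
print(km.survival_function_at_times([6, 12, 18, 24]))   # cumulative survival estimates

result = logrank_test(cotton["months"], rubber["months"],
                      event_observed_A=cotton["failed"], event_observed_B=rubber["failed"])
print(f"log-rank p = {result.p_value:.3f}")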

Relevance:

30.00%

Publisher:

Abstract:

AIM: First, to assess coagulation changes after surgery in children below 6 months of age; second, to detect differences attributable to the extent of surgery and postoperative infection. MATERIALS AND METHODS: Blood counts, haemoglobin concentration (Hb), haematocrit (Ht), prothrombin time (PT), activated partial thromboplastin time (aPTT) and thrombelastography (TEG) were studied preoperatively and 2 ± 0.5 days postoperatively. Patients were divided into three groups. I: minor surgery without access to the abdomen or thorax (n=51); II: abdominal or thoracic interventions (n=24); III: abdominal surgery with postoperative sepsis (n=11). RESULTS: Preoperative values of Hb, Ht and INR were related to the age of the infant. Postoperatively, clot strength and clot formation rate increased in group I (p<0.05). In group II, clot formation was initiated earlier (p<0.05) even though PT decreased (p<0.05). In group III, patients postoperatively developed a tendency towards hypocoagulability in all TEG parameters, but not in plasmatic coagulation. Postoperative TEG measurements were significantly inferior in group III compared with groups I and II. CONCLUSION: Our findings suggest activation of whole blood coagulation in the uncomplicated postoperative period despite a decrease in plasmatic coagulation. In sepsis, only thrombelastography, but not plasmatic coagulation, was affected.

Relevance:

30.00%

Publisher:

Abstract:

A long-term follow-up examination is important to test whether therapy with mycophenolate mofetil (MMF) or azathioprine (AZA) during the first year translates into differences in graft or patient survival and graft function. Therefore, 6-year follow-up data of a group of 80 consecutive renal transplant recipients were analyzed. The first group of 40 patients was treated with AZA, cyclosporine and prednisone, and the second group with MMF, cyclosporine and prednisone for the first 6 months. Graft failure rates were compared during follow-up. Creatinine, the inverse slope of creatinine (delta/creatinine) and 24-hour proteinuria at 6 years post transplantation were compared. The Kaplan-Meier analyses for death-censored and non-censored graft failure showed no difference between the groups. Creatinine values at 6 years were 139 ± 36 μmol/l (95% CI 125.9-151.2 μmol/l) for the AZA group and 149 ± 52 μmol/l (95% CI 133.9-164.9 μmol/l) for the MMF group. Delta/creatinine and 24-hour proteinuria at 6 years did not differ between the two groups. We conclude that an initial 6-month treatment with MMF as opposed to AZA reduced the early rejection rate, but did not result in superior long-term graft function or survival after 6 years of follow-up observation.

Relevance:

30.00%

Publisher:

Abstract:

Purpose: There is evidence that depressed mood and the perception of pain are related in patients with chronic illness. However, how individual resources such as self-efficacy and social support play a role in this association remains unclear. The aim of this study was to investigate the influence of both variables as either moderator or mediator. Method: In a longitudinal study, 274 injured workers (mean age 43.24 years) were investigated. Data were collected on sociodemographics, depressed mood, pain, social support, and self-efficacy at three months post-injury, and on depressed mood one year post-injury. Results: Hierarchical multiple linear regression analyses revealed that pain (β = 0.14; p < 0.01) and social support (β = -0.18; p < 0.001) were significant predictors of depressed mood. Self-efficacy moderated the relationship between pain and depressed mood after one year (β = -0.12; p < 0.05): pain had a stronger impact on depressed mood when self-efficacy was low than when it was high. Social support did not moderate the association. Conclusions: Self-efficacy for managing pain is important in the development of depressed mood. Based on the results of this study, we suggest that the detection of low social support and low self-efficacy might be important in the long-term rehabilitation process. Implications for Rehabilitation: The risk of depressed mood one year after an accident is high: one in five workers reports depressed mood. Protective factors against depressed mood in injured workers need to be considered in rehabilitation. Focusing on resources such as social support and self-efficacy could be protective against depressed mood. The early detection of low social support and low self-efficacy might be important in long-term rehabilitation processes.
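The reported moderation analysis corresponds to a regression that includes a pain-by-self-efficacy interaction term. The sketch below uses synthetic, standardized data and assumed variable names; it is not the study's dataset or code, and only illustrates how such a model is specified.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 274
df = pd.DataFrame({
    "pain":           rng.normal(0, 1, n),   # standardized pain at 3 months post-injury (assumed)
    "self_efficacy":  rng.normal(0, 1, n),   # standardized self-efficacy at 3 months (assumed)
    "social_support": rng.normal(0, 1, n),
})
# Synthetic outcome with a pain x self-efficacy interaction built in for illustration
df["depressed_mood_12m"] = (0.14 * df["pain"] - 0.18 * df["social_support"]
                            - 0.12 * df["pain"] * df["self_efficacy"]
                            + rng.normal(0, 1, n))

# 'pain * self_efficacy' expands to both main effects plus their interaction term;
# the pain:self_efficacy coefficient is the moderation effect.
model = smf.ols("depressed_mood_12m ~ pain * self_efficacy + social_support", data=df).fit()
print(model.summary().tables[1])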

Relevance:

30.00%

Publisher:

Abstract:

Post-traumatic sleep-wake disturbances are common after acute traumatic brain injury. Increased sleep need per 24 h and excessive daytime sleepiness are among the most prevalent post-traumatic sleep disorders and impair the quality of life of trauma patients. Nevertheless, neither the relation between traumatic brain injury and sleep outcome, nor the link between post-traumatic sleep problems and clinical measures in the acute phase after traumatic brain injury, has so far been addressed in a controlled and prospective approach. We therefore performed a prospective controlled clinical study to examine (i) sleep-wake outcome after traumatic brain injury, and (ii) clinical and laboratory predictors of poor sleep-wake outcome after acute traumatic brain injury. Forty-two of 60 included patients with first-ever traumatic brain injury were available for follow-up examinations. Six months after trauma, the average sleep need per 24 h as assessed by actigraphy was markedly increased in patients as compared to controls (8.3 ± 1.1 h versus 7.1 ± 0.8 h, P < 0.0001). Objective daytime sleepiness was found in 57% of trauma patients and 19% of healthy subjects, and the average sleep latency in patients was reduced to 8.7 ± 4.6 min (12.1 ± 4.7 min in controls, P = 0.0009). Patients, but not controls, markedly underestimated both their excessive sleep need and their excessive daytime sleepiness when assessed only by subjective means, emphasizing the unreliability of self-assessment of increased sleep propensity in traumatic brain injury patients. At polysomnography, slow wave sleep after traumatic brain injury was more consolidated. The most important risk factor for developing increased sleep need after traumatic brain injury was the presence of an intracranial haemorrhage. In conclusion, we provide controlled and objective evidence for a direct relation between sleep-wake disturbances and traumatic brain injury, and for clinically significant underestimation of post-traumatic sleep-wake disturbances by trauma patients.

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE This study is a prospective, controlled clinical and electrophysiologic trial examining the chronic course of posttraumatic sleep-wake disturbances (SWD). METHODS We screened 140 patients with acute, first-ever traumatic brain injury of any severity and included 60 patients for prospective follow-up examinations. Patients with prior brain trauma, other neurologic or systemic disease, drug abuse, or psychiatric comorbidities were excluded. Eighteen months after trauma, we performed detailed sleep assessment in 31 participants. As a control group, we enrolled healthy individuals without prior brain trauma matched for age, sex, and sleep satiation. RESULTS In the chronic state after traumatic brain injury, sleep need per 24 hours was persistently increased in trauma patients (8.1 ± 0.5 hours) as compared to healthy controls (7.1 ± 0.7 hours). The prevalence of chronic objective excessive daytime sleepiness was 67% in patients with brain trauma compared to 19% in controls. Patients significantly underestimated excessive daytime sleepiness and sleep need, emphasizing the unreliability of self-assessments on SWD in trauma patients. CONCLUSIONS This study provides prospective, controlled, and objective evidence for chronic persistence of posttraumatic SWD, which remain underestimated by patients. These results have clinical and medicolegal implications given that SWD can exacerbate other outcomes of traumatic brain injury, impair quality of life, and are associated with public safety hazards.