891 results for Clinical outcomes|Brazil
Abstract:
BACKGROUND A single non-invasive gene expression profiling (GEP) test (AlloMap®) is often used to determine whether a heart transplant recipient is at low risk of acute cellular rejection at the time of testing. In a randomized trial, use of the test (a GEP score from 0-40) was shown to be non-inferior to routine endomyocardial biopsy for surveillance after heart transplantation in selected low-risk patients with respect to clinical outcomes. Recently, it was suggested that the within-patient variability of consecutive GEP scores may be used to independently predict future clinical events; however, further studies were recommended. Here we analysed an independent patient population to determine the prognostic utility of within-patient variability of GEP scores in predicting future clinical events. METHODS We defined GEP score variability as the standard deviation of four GEP scores collected ≥315 days post-transplantation. Of the 737 patients from the Cardiac Allograft Rejection Gene Expression Observational (CARGO) II trial, 36 were assigned to the composite event group (death, re-transplantation or graft failure ≥315 days post-transplantation and within 3 years of the final GEP test) and 55 were assigned to the control group (non-event patients). In this case-control study, the performance of GEP score variability in predicting future events was evaluated by the area under the receiver operating characteristic curve (AUC ROC). The negative predictive values (NPV) and positive predictive values (PPV), including 95% confidence intervals (CI), of GEP score variability were calculated. RESULTS The estimated prevalence of events was 17%. Events occurred at a median of 391 (inter-quartile range 376) days after the final GEP test. The GEP variability AUC ROC for the prediction of a composite event was 0.72 (95% CI 0.6-0.8).
The NPV for a GEP score variability of 0.6 was 97% (95% CI 91.4-100.0); the PPV for a GEP score variability of 1.5 was 35.4% (95% CI 13.5-75.8). CONCLUSION In heart transplant recipients, GEP score variability may be used to predict the probability that a composite event will occur within 3 years after the last GEP score. TRIAL REGISTRATION Clinicaltrials.gov identifier NCT00761787.
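As an illustration of the two quantities this abstract turns on, here is a minimal sketch (hypothetical scores and counts, not CARGO II data) of the variability metric and the predictive values:

```python
import statistics

def gep_variability(scores):
    """Within-patient variability: standard deviation of four GEP scores
    collected >=315 days post-transplantation (as defined in the abstract)."""
    assert len(scores) == 4, "the metric is defined over four scores"
    return statistics.stdev(scores)

def npv(true_neg, false_neg):
    """Negative predictive value: TN / (TN + FN)."""
    return true_neg / (true_neg + false_neg)

def ppv(true_pos, false_pos):
    """Positive predictive value: TP / (TP + FP)."""
    return true_pos / (true_pos + false_pos)

# Hypothetical patient with four stable GEP scores
print(round(gep_variability([30, 32, 28, 31]), 2))  # low variability: 1.71
```

Low variability paired with a high NPV is what makes the metric useful for ruling out future events.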
Abstract:
OBJECTIVE The aim of this study was to examine the prevalence of nutritional risk and its association with multiple adverse clinical outcomes in a large cohort of acutely ill medical inpatients from a Swiss tertiary care hospital. METHODS We prospectively followed consecutive adult medical inpatients for 30 d. Multivariate regression models were used to investigate the association of the initial Nutritional Risk Score (NRS 2002) with mortality, impairment in activities of daily living (Barthel Index <95 points), hospital length of stay, hospital readmission rates, and quality of life (QoL; adapted from EQ-5D); all parameters were measured at 30 d. RESULTS Of 3186 patients (mean age 71 y, 44.7% women), 887 (27.8%) were at risk for malnutrition with an NRS ≥3 points. We found strong associations (odds ratio/hazard ratio [OR/HR], 95% confidence interval [CI]) between nutritional risk and mortality (OR/HR, 7.82; 95% CI, 6.04-10.12), impaired Barthel Index (OR/HR, 2.56; 95% CI, 2.12-3.09), time to hospital discharge (OR/HR, 0.48; 95% CI, 0.43-0.52), hospital readmission (OR/HR, 1.46; 95% CI, 1.08-1.97), and all five dimensions of the QoL measures. Associations remained significant after adjustment for sociodemographic characteristics, comorbidities, and medical diagnoses. Results were robust in subgroup analysis with evidence of effect modification (P for interaction < 0.05) based on age and main diagnosis groups. CONCLUSION Nutritional risk is prevalent in acutely ill medical inpatients and is associated with increased medical resource use, adverse clinical outcomes, and impairments in functional ability and QoL. Randomized trials are needed to evaluate evidence-based preventive and treatment strategies focusing on nutritional factors to improve outcomes in these high-risk patients.
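The odds ratios above come from adjusted multivariate models, but the unadjusted version is simple arithmetic on a 2×2 table; a minimal sketch with hypothetical counts:

```python
def odds_ratio(a, b, c, d):
    """Unadjusted odds ratio from a 2x2 table:
    a = at-risk patients with the event,  b = at-risk without,
    c = not-at-risk with the event,       d = not-at-risk without."""
    return (a * d) / (b * c)

# Hypothetical counts: 40/887 at-risk and 23/2299 not-at-risk patients died
print(round(odds_ratio(40, 847, 23, 2276), 2))  # 4.67
```

Adjusted ORs, like those reported, additionally condition on covariates via logistic regression rather than this crude cross-product.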
Abstract:
PURPOSE To compare patient outcomes and complication rates after different decompression techniques or instrumented fusion (IF) in lumbar spinal stenosis (LSS). METHODS The multicentre study was based on Spine Tango data. Inclusion criteria were LSS with a posterior decompression and pre- and postoperative COMI assessment between 3 and 24 months. 1,176 cases were assigned to four groups: (1) laminotomy (n = 642), (2) hemilaminectomy (n = 196), (3) laminectomy (n = 230) and (4) laminectomy combined with an IF (n = 108). Clinical outcomes were achievement of minimum relevant change in COMI back and leg pain and COMI score (2.2 points), surgical and general complications, measures taken due to complications, and reintervention on the index level based on patient information. The inverse propensity score weighting method was used for adjustment. RESULTS Laminotomy, hemilaminectomy and laminectomy were significantly less beneficial than laminectomy in combination with IF regarding leg pain (ORs with 95% CI 0.52, 0.34-0.81; 0.25, 0.15-0.41; 0.44, 0.27-0.72, respectively) and COMI score improvement (ORs with 95% CI 0.51, 0.33-0.81; 0.30, 0.18-0.51; 0.48, 0.29-0.79, respectively). However, the sole decompressions caused significantly fewer surgical (ORs with 95% CI 0.42, 0.26-0.69; 0.33, 0.17-0.63; 0.39, 0.21-0.71, respectively) and general complications (ORs with 95% CI 0.11, 0.04-0.29; 0.03, 0.003-0.41; 0.25, 0.09-0.71, respectively) than laminectomy in combination with IF. Accordingly, the likelihood of required measures was also significantly lower after laminotomy (OR 0.28, 95% CI 0.17-0.46), hemilaminectomy (OR 0.28, 95% CI 0.15-0.53) and after laminectomy (OR 0.39, 95% CI 0.22-0.68) in comparison with laminectomy with IF. The likelihood of a reintervention was not significantly different between the treatment groups. DISCUSSION As already demonstrated in the literature, decompression in patients with LSS is a very effective treatment. 
Despite better patient outcomes after laminectomy in combination with IF, caution is advised due to the higher rates of surgical and general complications and the consequent required measures. Based on the current study, laminotomy or laminectomy, rather than hemilaminectomy, is recommended for achieving the minimum relevant pain relief.
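The adjustment method named in this abstract, inverse propensity score weighting, reweights each case by the inverse of its probability of receiving the treatment it actually got. A minimal Horvitz-Thompson sketch (toy data; propensity scores assumed already estimated, e.g. by logistic regression):

```python
def ipw_ate(y, t, ps):
    """Inverse-propensity-weighted estimate of the average treatment effect.
    y: outcomes, t: 1 = treated / 0 = control, ps: estimated propensity scores."""
    n = len(y)
    treated = sum(yi * ti / p for yi, ti, p in zip(y, t, ps)) / n
    control = sum(yi * (1 - ti) / (1 - p) for yi, ti, p in zip(y, t, ps)) / n
    return treated - control

# Toy example: with uniform propensity 0.5 the estimate reduces to a
# difference of scaled group means, here zero
print(ipw_ate([1, 0, 1, 0], [1, 1, 0, 0], [0.5] * 4))  # 0.0
```

In practice the weights come from a model of treatment assignment given baseline covariates, which is what lets non-randomized groups be compared.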
Abstract:
The risks associated with gestational diabetes (GD) can be reduced with active treatment able to improve glycemic control. Advances in mobile health can provide new patient-centric models for GD that create personalized health care services, increase patient independence, improve patients' self-management capabilities, and potentially improve treatment compliance. In these models, decision-support functions play an essential role. The telemedicine system MobiGuide provides personalized medical decision support for GD patients that is based on computerized clinical guidelines and adapted to a mobile environment. The patient's access to the system is supported by a smartphone-based application that enhances the efficiency and ease of use of the system. We formalized the GD guideline into a computer-interpretable guideline (CIG). We identified several workflows that provide decision-support functionalities to patients, and 4 types of personalized advice to be delivered through a mobile application at home, which is a preliminary step to providing decision-support tools in a telemedicine system: (1) therapy, to help patients comply with medical prescriptions; (2) monitoring, to help patients comply with monitoring instructions; (3) clinical assessment, to inform patients about their health conditions; and (4) upcoming events, to deal with patients' personal context or special events. The whole process of specifying patient-oriented decision-support functionalities ensures that the system is based on the knowledge contained in the GD clinical guideline and thus follows evidence-based recommendations, while remaining patient-oriented, which could enhance clinical outcomes and patients' acceptance of the whole system.
Abstract:
Objective: To determine the impact of the Dynesys system on functional outcomes in patients with spinal degenerative diseases. Summary of background data: The Dynesys system has been proposed as an alternative to vertebral fusion for several spinal degenerative diseases. Because it has been used in people meeting different diagnostic criteria, with different tools used to measure clinical outcomes, unifying the currently available results is very difficult. Methods: The Medical Literature Analysis and Retrieval System Online (MEDLINE) database via PubMed, EMBASE, and the Cochrane Library Plus were searched for all studies published up to November 2012 in which patients with spinal degenerative diseases underwent an operation with Dynesys and the results were evaluated by an analysis of functional outcomes. No limits were applied for article type, date of publication or language. Results: A total of 134 articles were found, 26 of which fulfilled the inclusion criteria after assessment by two reviewers. All of them were case series, except for a multicenter randomized clinical trial (RCT) and a prospective case-control study. The selected articles comprised a total of 1507 cases. The most frequent diagnoses were lumbar spinal canal stenosis (LSCS), degenerative disc disease (DDD), degenerative spondylolisthesis (DS) and lumbar degenerative scoliosis (LDS). In cases of lumbar spinal canal stenosis, Dynesys was combined with surgical decompression. Several tools to measure functional disability and general health status were found; the Oswestry Disability Index (ODI), the Korean version of the ODI (K-ODI), Prolo, SF-36, SF-12, the Roland-Morris Disability Questionnaire (RMDQ), and the pain Visual Analogue Scale (VAS) were the most used. They showed positive results in all case series reviewed. In most studies the ODI decreased by about 25% (e.g. from a score of 85% to 60%). Better results were found when dynamic fusion was combined with nerve root decompression.
Functional outcomes and leg pain scores with Dynesys were statistically non-inferior to posterolateral spinal fusion using autogenous bone. When Dynesys with decompression was compared with posterior lumbar interbody fusion (PLIF) with decompression, differences in ODI and VAS were not statistically significant. Conclusions: In patients with spinal degenerative diseases due to degenerative disc disorders, spinal canal stenosis and degenerative spondylolisthesis, surgery with Dynesys and decompression improves functional outcomes, decreases disability, and reduces back and leg pain. More studies are needed to conclude that dynamic stabilization is better than posterolateral and posterior lumbar interbody fusion. Studies comparing Dynesys with decompression against decompression alone should be done in order to isolate the effect of the dynamic stabilization.
Abstract:
Aim To explore relationships between sirolimus dosing, concentration and clinical outcomes. Methods Data were collected from 25 kidney transplant recipients (14 M/11 F), a median of 278 days after transplantation. Outcomes of interest were white blood cell (WBC) count, platelet (PLT) count, and haematocrit (HCT). A naive pooled data analysis was performed with outcomes dichotomized (Mann-Whitney U-tests). Results Several patients experienced at least one episode in which WBC (n = 9), PLT (n = 12), or HCT (n = 21) fell below the lower limit of the normal range. WBC and HCT were significantly lower (P < 0.05) when the sirolimus dose was greater than 10 mg/day and the sirolimus concentration greater than 12 μg/L. No relationship was shown for PLT and dichotomized sirolimus dose or concentration. Conclusions Given this relationship between sirolimus concentration and effect, linked population pharmacokinetic-pharmacodynamic modelling using data from more renal transplant recipients should now be used to quantify the time course of these relationships to optimize dosing and minimize the risk of these adverse outcomes.
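The dichotomized comparisons above rely on the Mann-Whitney U-test; the U statistic itself is just a pairwise count, sketched below (p-values require a stats library such as scipy and are omitted here):

```python
def mann_whitney_u(a, b):
    """U statistic for group a vs group b: the number of pairs (x, y)
    with x > y, counting ties as half. No p-value, no tie correction."""
    return sum((x > y) + 0.5 * (x == y) for x in a for y in b)

# Toy example: haematocrit values on low-dose vs high-dose sirolimus;
# every low-dose value exceeds every high-dose value, so U is maximal
print(mann_whitney_u([0.42, 0.44, 0.45], [0.30, 0.33]))  # 6.0
```

A U near its maximum (len(a) * len(b)) or near zero indicates the two dose groups' distributions are well separated.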
Abstract:
Purpose - To assess clinical outcomes and subjective experience after bilateral implantation of a diffractive trifocal intraocular lens (IOL). Setting - Midland Eye Institute, Solihull, United Kingdom. Design - Cohort study. Methods - Patients had bilateral implantation of Finevision trifocal IOLs. Uncorrected distance visual acuity, corrected distance visual acuity (CDVA), and manifest refraction were measured 2 months postoperatively. Defocus curves were assessed under photopic and mesopic conditions over a range of +1.50 to -4.00 diopters (D) in 0.50 D steps. Contrast sensitivity function was assessed under photopic conditions. Halometry was used to measure the angular size of monocular and binocular photopic scotomas arising from a glare source. Patient satisfaction with uncorrected near vision was assessed using the Near Activity Visual Questionnaire (NAVQ). Results - The mean monocular CDVA was 0.08 logMAR ± 0.08 (SD) and the mean binocular CDVA, 0.06 ± 0.08 logMAR. Defocus curve testing showed an extended range of clear vision from +1.00 to -2.50 D defocus, with a significant difference in acuity between photopic conditions and mesopic conditions at -1.50 D defocus only. Photopic contrast sensitivity was significantly better binocularly than monocularly at all spatial frequencies. Halometry showed a glare scotoma of a mean size similar to that in previous studies of multifocal and accommodating IOLs; there were no subjective complaints of dysphotopsia. The mean NAVQ Rasch score for satisfaction with near vision was 15.9 ± 10.7 logits. Conclusions - The trifocal IOL implanted binocularly produced good distance visual acuity and near and intermediate visual function. Patients were very satisfied with their uncorrected near vision.
Abstract:
Background: Major Depressive Disorder (MDD) is among the most prevalent and disabling medical conditions worldwide. Identification of clinical and biological markers ("biomarkers") of treatment response could personalize clinical decisions and lead to better outcomes. This paper describes the aims, design, and methods of a discovery study of biomarkers in antidepressant treatment response, conducted by the Canadian Biomarker Integration Network in Depression (CAN-BIND). The CAN-BIND research program investigates and identifies biomarkers that help to predict outcomes in patients with MDD treated with antidepressant medication. The primary objective of this initial study (known as CAN-BIND-1) is to identify individual and integrated neuroimaging, electrophysiological, molecular, and clinical predictors of response to sequential antidepressant monotherapy and adjunctive therapy in MDD. Methods: CAN-BIND-1 is a multisite initiative involving 6 academic health centres working collaboratively with other universities and research centres. In the 16-week protocol, patients with MDD are treated with a first-line antidepressant (escitalopram 10-20 mg/d) that, if clinically warranted after eight weeks, is augmented with an evidence-based, add-on medication (aripiprazole 2-10 mg/d). Comprehensive datasets are obtained using clinical rating scales; behavioural, dimensional, and functioning/quality of life measures; neurocognitive testing; genomic, genetic, and proteomic profiling from blood samples; combined structural and functional magnetic resonance imaging; and electroencephalography. De-identified data from all sites are aggregated within a secure neuroinformatics platform for data integration, management, storage, and analyses. Statistical analyses will include multivariate and machine-learning techniques to identify predictors, moderators, and mediators of treatment response. 
Discussion: From June 2013 to February 2015, a cohort of 134 participants (85 outpatients with MDD and 49 healthy participants) was evaluated at baseline. The clinical characteristics of this cohort are similar to those in other studies of MDD. Recruitment at all sites is ongoing towards a target sample of 290 participants. CAN-BIND will identify biomarkers of treatment response in MDD through extensive clinical, molecular, and imaging assessments, in order to improve treatment practice and clinical outcomes. It will also create an innovative, robust platform and database for future research. Trial registration: ClinicalTrials.gov identifier NCT01655706. Registered July 27, 2012.
Abstract:
The treatment of presbyopia has been the focus of much scientific and clinical research over recent years, not least due to an increasingly aging population but also the desire for spectacle independence. Many lens- and nonlens-based approaches have been investigated, and with advances in biomaterials and improved surgical methods, removable corneal inlays have been developed. One such development is the KAMRA™ inlay, in which a small entrance pupil is exploited to create a pinhole-type effect that increases the depth of focus and enables improvement in near visual acuity. Short- and long-term clinical studies have all reported significant improvement in near and intermediate vision compared to preoperative measures following monocular implantation (nondominant eye), with a large proportion of patients achieving Jaeger (J) 2 to J1 (~0.00 logMAR to ~0.10 logMAR) at the final follow-up. Although distance acuity is reduced slightly in the treated eye, binocular visual acuity and function remain very good (mean 0.10 logMAR or better). The safety of the inlay is well established and it is easily removable; although some patients have developed corneal changes, these are clinically insignificant and their incidence appears to reduce markedly with advancements in KAMRA design, implantation technique, and femtosecond laser technology. This review aims to summarize the currently published peer-reviewed studies on the safety and efficacy of the KAMRA inlay and discusses the surgical and clinical outcomes with respect to the patient's visual function.
Abstract:
Background: Increased exposure to anticholinergic medication is problematic, particularly in those aged 80 years and older.
Objective: The aim of this systematic review was to identify tools used to quantify anticholinergic medication burden and determine the most appropriate tool for use in longitudinal research, conducted in those aged 80 years and older.
Methods: A systematic literature search was conducted across six electronic databases to identify existing tools. Data extraction was conducted independently by two researchers; studies describing the development of each tool were also retrieved and relevant data extracted. An assessment of quality was completed for all studies. Tools were assessed in terms of their measurement of the association between anticholinergic medication burden and a defined set of clinical outcomes, their development and their suitability for use in longitudinal research; the latter was evaluated on the basis of criteria defined as the key attributes of an ideal anticholinergic risk tool.
Results: In total, 807 papers were retrieved, 13 studies were eligible for inclusion and eight tools were identified. Included studies were classed as ‘very good’ or ‘good’ following the quality assessment analysis; one study was unclassified. Anticholinergic medication burden as measured in studies was associated with impaired cognitive and physical function, as well as an increased frequency of falls. The Drug Burden Index (DBI) exhibited most of the key attributes of an ideal anticholinergic risk tool.
Conclusion: This review identified the DBI as the most appropriate tool for use in longitudinal research focused on older people and their exposure to anticholinergic medication burden.
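For concreteness, the Drug Burden Index identified above is, as commonly described in the literature (Hilmer et al.), a sum of saturating dose terms; a minimal sketch with hypothetical doses:

```python
def drug_burden_index(drugs):
    """DBI as commonly described: for each anticholinergic or sedative drug,
    add D / (D + delta), where D is the patient's daily dose and delta is
    the minimum recommended daily dose. Each drug's term approaches 1 at
    high doses, so total burden grows with both dose and drug count."""
    return sum(dose / (dose + min_dose) for dose, min_dose in drugs)

# Hypothetical regimen: one drug at its minimum dose, one at four times it
print(round(drug_burden_index([(10, 10), (20, 5)]), 2))  # 1.3
```

This dose-response shape is part of what the review credits the DBI for, compared with tools that simply count anticholinergic drugs.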
Abstract:
Hemizygous deletion of 17p (del(17p)) has been identified as a variable associated with poor prognosis in myeloma, although its impact in the context of thalidomide therapy is not well described. The clinical outcome of 85 myeloma patients with del(17p) treated in a clinical trial incorporating both conventional and thalidomide-based induction therapies was examined. The clinical impact of deletion, low expression, and mutation of TP53 was also determined. Patients with del(17p) did not have inferior response rates compared to patients without del(17p), but, despite this, del(17p) was associated with impaired overall survival (OS) (median OS 26.6 vs. 48.5 months, P <0.001). Within the del(17p) group, thalidomide induction therapy was associated with improved response rates compared to conventional therapy, but there was no impact on OS. Thalidomide maintenance was associated with impaired OS, although our analysis suggests that this effect may have been due to confounding variables. A minimally deleted region on 17p13.1 involving 17 genes was identified, of which only TP53 and SAT2 were underexpressed. TP53 was mutated in <1% in patients without del(17p) and in 27% of patients with del(17p). The higher TP53 mutation rate in samples with del(17p) suggests a role for TP53 in these clinical outcomes. In conclusion, del(17p) defined a patient group associated with short survival in myeloma, and although thalidomide induction therapy was associated with improved response rates, it did not impact OS, suggesting that alternative therapeutic strategies are required for this group. (C) 2011 Wiley-Liss, Inc.
Abstract:
AIMS: Device-based remote monitoring (RM) has been linked to improved clinical outcomes at short- to medium-term follow-up. Whether this benefit extends to long-term follow-up is unknown. We sought to assess the effect of device-based RM on long-term clinical outcomes in recipients of implantable cardioverter-defibrillators (ICD). METHODS: We performed a retrospective cohort study of consecutive patients who underwent ICD implantation for primary prevention. RM was initiated with patient consent according to the availability of RM hardware at implantation. Patients with concomitant cardiac resynchronization therapy were excluded. Data on hospitalizations, mortality and cause of death were systematically assessed using a nationwide healthcare platform. A Cox proportional hazards model was employed to estimate the effect of RM on mortality and on a composite endpoint of cardiovascular mortality and hospital admission due to heart failure (HF). RESULTS: 312 patients were included, with a median follow-up of 37.7 months (range 1 to 146). 121 patients (38.2%) had been under RM since the first outpatient visit post-ICD and 191 were in conventional follow-up. No differences were found regarding age, left ventricular ejection fraction, heart failure etiology or NYHA class at implantation. Patients under RM had higher long-term survival (hazard ratio [HR] 0.50, CI 0.27-0.93, p=0.029) and a lower incidence of the composite outcome (HR 0.47, CI 0.27-0.82, p=0.008). After multivariate survival analysis, overall survival was independently associated with younger age, higher LVEF, NYHA class lower than 3, and RM. CONCLUSION: RM was independently associated with increased long-term survival and a lower incidence of a composite endpoint of hospitalization for HF or cardiovascular mortality.
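The survival comparison above uses a Cox proportional hazards model, which needs a statistics package; as a minimal, dependency-free illustration of the underlying survival estimate, here is a naive Kaplan-Meier sketch (toy follow-up data; ties and same-time censoring are handled simplistically, one record at a time):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival steps. times: follow-up in months;
    events: 1 = death, 0 = censored. Returns (time, S(t)) after each death.
    Censored records simply leave the risk set without changing S(t)."""
    at_risk = len(times)
    s = 1.0
    curve = []
    for t, e in sorted(zip(times, events)):
        if e:
            s *= (at_risk - 1) / at_risk
            curve.append((t, s))
        at_risk -= 1
    return curve

# Toy cohort: deaths at 12 and 30 months, one patient censored at 24 months
print(kaplan_meier([12, 24, 30], [1, 0, 1]))
```

A Cox model then compares such curves between groups (here RM vs conventional follow-up) while adjusting for covariates, yielding the hazard ratios reported.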
Abstract:
Introduction: Intravenous thrombolysis with alteplase in acute ischaemic stroke improves clinical outcomes, but it has limited efficacy and is associated with an increased risk of intracranial haemorrhage. A modified tissue plasminogen activator, tenecteplase, has been shown to be at least as effective, with a lower risk of haemorrhage, in acute myocardial infarction thrombolysis. To date, two completed phase II randomised controlled studies comparing tenecteplase and alteplase in acute ischaemic stroke have shown variable results. Methods: A literature review of thrombolytic agents used in myocardial infarction and acute ischaemic stroke was performed, followed by a retrospective investigation of the bolus-to-infusion delay of alteplase administration. The main focus of this thesis is the report of our single-centre phase II randomised controlled trial that compared tenecteplase (0.25 mg/kg, maximum 25 mg) and alteplase (0.9 mg/kg, maximum 90 mg, 10% as the initial bolus, followed by a one-hour infusion of the remaining dose) in acute ischaemic stroke thrombolysis using advanced imaging as biomarkers. Imaging comprised baseline computed tomography (CT), CT perfusion (CTP) and CT angiography (CTA), and CT+CTA at 24-48 hours. The primary end-point was penumbral salvage (CTP-defined penumbra volume minus follow-up CT infarct volume). A sub-study of coagulation and fibrinolysis analysis of the two agents was performed by comparing a group of coagulation variables measured pre-treatment, 3-12 hours, and 24±3 hours post thrombolysis. An individual patient data (IPD) meta-analysis was carried out using all three completed tenecteplase/alteplase comparison studies in stroke thrombolysis. We compared clinical outcomes, including the modified Rankin scale at 3 months, early neurological improvement at 24 hours, intracerebral haemorrhage rate and mortality at 3 months, between all three tenecteplase doses examined (0.1 mg/kg, 0.25 mg/kg, and 0.4 mg/kg) and standard alteplase.
Imaging outcomes, including penumbral salvage and recanalisation rates, were also compared using the data from the two studies in which advanced imaging was carried out. Results: Delay between the initial bolus and the subsequent infusion in the administration of alteplase is common. This may reduce the likelihood of achieving a good functional outcome. Among the 104 patients recruited in the ATTEST trial, 71 contributed to the imaging primary outcome. No significant differences were observed for penumbral salvage [68 (SD 28)% tenecteplase vs 68 (SD 23)% alteplase; mean difference 1%, 95% confidence interval -10% to 12%, p=0.81] or for any secondary end-point. The SICH incidence (1/52, 2% vs 2/51, 4%, by SITS-MOST definition, p=0.55; 3/52, 6% tenecteplase vs 4/51, 8% alteplase, by ECASS-2 definition, p=0.59) did not differ significantly. There was a trend towards lower ICH risk in the tenecteplase group (8/52, 15% tenecteplase vs 14/51, 29% alteplase, p=0.091). Compared to baseline, alteplase caused significant hypofibrinogenaemia (p=0.002), prolonged prothrombin time (PT) (p=0.011), hypoplasminogenaemia (p=0.001) and lower Factor V (p=0.002) at 3-12 hours after administration, with persistent hypofibrinogenaemia at 24 h (p=0.011), while only minor hypoplasminogenaemia (p=0.029) was seen in the tenecteplase group. Tenecteplase consumed less plasminogen (p<0.001) and fibrinogen (p=0.002) compared with alteplase. In a pooled analysis, tenecteplase 0.25 mg/kg had the greatest odds of achieving early neurological improvement (OR [95% CI] 3.3 [1.5, 7.2], p=0.093) and excellent functional outcome (mRS 0-1) at three months (OR [95% CI] 1.9 [0.8, 4.4], p=0.28), with reduced odds of ICH (OR [95% CI] 0.6 [0.2, 1.8], p=0.43) compared with alteplase. Only 19 patients were treated with tenecteplase 0.4 mg/kg, which showed increased odds of SICH compared with alteplase (OR [95% CI] 6.2 [0.7, 56.3]).
In the two studies where advanced imaging was performed, the imaging outcomes did not differ in the IPD analysis. Conclusion: Tenecteplase 0.25 mg/kg has the potential to be a better alternative to alteplase. It can be given as a single bolus, does not disrupt systemic coagulation, and is possibly safer and more effective in clot lysis. A further phase III study comparing tenecteplase and alteplase in acute ischaemic stroke is warranted.
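The primary imaging end-point defined above is a simple volume difference; a minimal sketch with hypothetical volumes:

```python
def penumbral_salvage_pct(penumbra_ml, followup_infarct_ml):
    """Penumbral salvage as defined in the text: CTP-defined penumbra volume
    minus follow-up CT infarct volume, here expressed as a percentage of the
    penumbra (clamped at 0 when the infarct outgrows the penumbra)."""
    salvaged = max(penumbra_ml - followup_infarct_ml, 0.0)
    return 100.0 * salvaged / penumbra_ml

# Hypothetical patient: 100 mL penumbra at baseline, 32 mL infarct at follow-up
print(penumbral_salvage_pct(100.0, 32.0))  # 68.0
```

Expressing salvage as a percentage of the baseline penumbra is what makes the end-point comparable across patients with different lesion sizes.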
Abstract:
Introduction: The fluocinolone acetonide slow-release implant (Iluvien®) was approved in December 2013 in the UK for the treatment of pseudophakic eyes with diabetic macular oedema (DMO) that is unresponsive to other available therapies. This approval was based on evidence from the FAME trials, which were conducted at a time when ranibizumab was not available. There is a paucity of data on the implementation of guidance on selecting patients for this treatment modality, and also on the real-world outcome of fluocinolone therapy, especially in patients who have been unresponsive to ranibizumab therapy. Method: A retrospective study of consecutive patients treated with fluocinolone between January and August 2014 at three sites was conducted to evaluate the selection criteria used, baseline characteristics and clinical outcomes at the 3-month time point. Results: Twenty-two pseudophakic eyes of 22 consecutive patients were included. The majority of patients had prior therapy with multiple intravitreal anti-VEGF injections. Four eyes had controlled glaucoma. At baseline, mean VA and CRT were 50.7 letters and 631 μm, respectively. After 3 months, 18 eyes had improved CRT, of which 15 also had improved VA. No adverse effects were noted. One additional patient required IOP-lowering medication. Despite being unresponsive to multiple prior therapies, including laser and anti-VEGF injections, switching to fluocinolone achieved treatment benefit. Conclusion: The patient-level selection criteria proposed by NICE guidance on fluocinolone appeared to be implemented. The data from this study provide new evidence on early outcomes following fluocinolone therapy in eyes with DMO that had not responded to laser and other intravitreal agents.
Abstract:
Excessive occlusal surface wear can result in occlusal disharmony and functional and esthetic impairment. As a therapeutic approach, conventional single crowns have been proposed, but this kind of treatment is complex, highly invasive and expensive. This case report describes the clinical outcomes of an alternative, minimally invasive treatment based on direct adhesive pin-retained restorations. A 64-year-old woman with a severely worn dentition, eating problems related to missing teeth and generalized tooth hypersensitivity was referred for treatment. Treatment planning based on a diagnostic wax-up simulation was used to guide the reconstruction of the maxillary anterior teeth with direct composite resin over self-threading dentin pins. As the remaining mandibular teeth were extremely worn, a tooth-supported overdenture was installed. A stabilization splint was also used to protect the restorations. This treatment was a less expensive alternative to full-mouth rehabilitation, with positive esthetic and functional outcomes after 1.5 years of follow-up.