92 results for "Outcomes of change readiness"


Relevance: 100.00%

Abstract:

AIM To assess the clinical and radiographic outcomes of a combined resective and regenerative approach in the treatment of peri-implantitis. MATERIALS AND METHODS Subjects with implants diagnosed with peri-implantitis (i.e., pocket probing depth (PPD) ≥5 mm with concomitant bleeding on probing (BoP) and ≥2 mm of marginal bone loss or exposure of ≥1 implant thread) were treated with a combined approach: a deproteinized bovine bone mineral and a collagen membrane were applied in the intrabony component, and implantoplasty was performed in the suprabony component of the peri-implant lesion. The soft tissues were apically repositioned to allow non-submerged healing. Clinical and radiographic parameters were evaluated at baseline and 12 months after treatment. RESULTS Eleven subjects with 11 implants were treated and completed the 12-month follow-up. No implant was lost, yielding a 100% survival rate. At baseline, the mean PPD and mean clinical attachment level (CAL) were 8.1 ± 1.8 mm and 9.7 ± 2.5 mm, respectively. After 1 year, the mean PPD was 4.0 ± 1.3 mm and the mean CAL was 6.7 ± 2.5 mm. The differences between the baseline and follow-up examinations were statistically significant (P = 0.001). Mucosal recession increased from 1.7 ± 1.5 mm at baseline to 3.0 ± 1.8 mm at the 12-month follow-up (P = 0.003). The mean percentage of sites with BoP around the selected implants decreased from 19.7 ± 40.1% at baseline to 6.1 ± 24.0% after 12 months (P = 0.032). The radiographic marginal bone level decreased from 8.0 ± 3.7 mm at baseline to 5.2 ± 2.2 mm at the 12-month follow-up (P = 0.000001). The radiographic fill of the intrabony component of the defect amounted to 93.3 ± 13.0%. CONCLUSION Within the limits of this study, a combined regenerative and resective approach for the treatment of peri-implant defects yielded positive outcomes in terms of PPD reduction and radiographic defect fill after 12 months.
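
The case definition in this abstract reduces to a simple screening rule (PPD ≥5 mm with BoP, plus ≥2 mm marginal bone loss or ≥1 exposed implant thread). Below is a minimal illustrative sketch of that rule in Python; the function and parameter names are hypothetical and not taken from the study.

```python
# Illustrative sketch of the peri-implantitis case definition from the abstract:
# PPD >= 5 mm with bleeding on probing, plus >= 2 mm marginal bone loss
# or exposure of >= 1 implant thread. Names are hypothetical, not from the study.

def meets_case_definition(ppd_mm: float,
                          bleeding_on_probing: bool,
                          marginal_bone_loss_mm: float,
                          exposed_threads: int) -> bool:
    """Return True if an implant meets the stated peri-implantitis inclusion criteria."""
    pocket_and_bop = ppd_mm >= 5.0 and bleeding_on_probing
    bone_involvement = marginal_bone_loss_mm >= 2.0 or exposed_threads >= 1
    return pocket_and_bop and bone_involvement

# Example: PPD 8.1 mm with BoP and 3 mm bone loss -> meets the definition
print(meets_case_definition(8.1, True, 3.0, 0))  # True
```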

Relevance: 100.00%

Abstract:

OBJECTIVE The aim of this prospective clinical study was to compare patient-reported outcomes for maxillary conventional dentures and maxillary implant-supported dentures. MATERIAL AND METHODS Twenty-one patients (6 women and 15 men) who were edentulous in the maxilla and had problems with their existing dentures were included. Twelve patients (4 women and 8 men) received a new set of conventional dentures because the existing dentures were inadequate. In nine patients (2 women and 7 men), the existing dentures were adjusted by relining or rebasing. All patients subsequently received implant-supported dentures on two retentive anchors. In total, 42 implants were inserted in the anterior maxilla. Participants rated their satisfaction with their existing conventional dentures, 2 months after insertion of the new conventional dentures, and 2 months after insertion of the implant-supported dentures. At each time point, patients responded to questionnaires based on the Oral Health Impact Profile (OHIP) using visual analog scales. Seven domains (functional limitation, physical pain, psychological discomfort, physical disability, psychological disability, social disability, and handicap) were assessed; higher scores implied poorer patient satisfaction. In addition, the questionnaire evaluated cleaning ability, general satisfaction, speech, comfort, esthetics, stability, and chewing ability; here, higher scores implied higher patient satisfaction. RESULTS Patient satisfaction increased significantly for implant-supported dentures compared with the old dentures in all seven OHIP subgroups, as well as for cleaning ability, general satisfaction, ability to speak, comfort, esthetics, and stability (P < 0.05). The comparison of new conventional dentures and implant-supported dentures revealed a statistically significant increase in satisfaction for functional limitation (difference of 33.2 mm), psychological discomfort (difference of 36.7 mm), physical disability (difference of 36.3 mm), and social disability (difference of 23.5 mm) (P < 0.05). Additionally, general satisfaction, chewing ability, speech, and stability improved significantly with implant-supported dentures (P < 0.05). CONCLUSIONS Within the limits of this study, maxillary dentures retained by two implants provided some significant short-term improvements over conventional dentures in oral health-related quality of life.

Relevance: 100.00%

Abstract:

OBJECTIVES This study sought to describe the frequency and clinical impact of acute scaffold disruption and late strut discontinuity of the second-generation Absorb bioresorbable polymeric vascular scaffold (Absorb BVS, Abbott Vascular, Santa Clara, California) in the ABSORB (A Clinical Evaluation of the Bioabsorbable Everolimus Eluting Coronary Stent System in the Treatment of Patients With De Novo Native Coronary Artery Lesions) cohort B study by optical coherence tomography (OCT) post-procedure and at 6, 12, 24, and 36 months. BACKGROUND Fully bioresorbable scaffolds are a novel approach to the treatment of coronary narrowing that provides transient vessel support with drug-delivery capability, without the long-term limitations of metallic drug-eluting stents. However, a drawback of the bioresorbable scaffold is the potential for disruption of the strut network when it is overexpanded. Conversely, structural discontinuity of the polymeric struts at a late stage is a biologically programmed fate of the scaffold during the course of bioresorption. METHODS The ABSORB cohort B trial is a multicenter, single-arm trial assessing the safety and performance of the Absorb BVS in the treatment of 101 patients with de novo native coronary artery lesions. The current analysis included 51 patients with 143 OCT pullbacks who underwent OCT at baseline and follow-up. Acute disruption and late discontinuities were diagnosed on OCT by the presence of stacked or overhung struts, or of isolated intraluminal struts disconnected from the expected circularity of the device. RESULTS Of the 51 patients with OCT imaging post-procedure, acute scaffold disruption was observed in 2 patients (3.9%), which could be related to overexpansion of the scaffold at the time of implantation. One of these patients had a target lesion revascularization that was presumably related to the disruption. Of the 49 patients without acute disruption, late discontinuities were observed in 21 patients. There were no major adverse cardiac events associated with this finding, except for 1 patient who had a non-ischemia-driven target lesion revascularization. CONCLUSIONS Acute scaffold disruption is a rare iatrogenic phenomenon that has been anecdotally associated with anginal symptoms, whereas late strut discontinuity is observed in approximately 40% of patients and could be viewed as a serendipitous OCT finding of the normal bioresorption process, without clinical implications. (ABSORB Clinical Investigation, Cohort B [ABSORB B]; NCT00856856).

Relevance: 100.00%

Abstract:

AIMS To investigate the outcomes of percutaneous coronary intervention (PCI) in bifurcation versus non-bifurcation lesions using the next-generation Resolute zotarolimus-eluting stent (R-ZES). METHODS AND RESULTS We analyzed 3-year pooled data from the RESOLUTE All-Comers trial and the RESOLUTE International registry. The R-ZES was used in 2772 patients with non-bifurcation lesions and 703 patients with bifurcation lesions, of whom 482 were treated with a simple-stent technique (1 stent used to treat the bifurcation lesion) and 221 with a complex bifurcation technique (2 or more stents used). The primary endpoint was 3-year target lesion failure (TLF, defined as the composite of death from cardiac causes, target vessel myocardial infarction, or clinically indicated target lesion revascularization [TLR]); it occurred in 13.3% of bifurcation vs 11.3% of non-bifurcation lesion patients (adjusted P=.06). Landmark analysis revealed that this difference was driven by events within the first 30 days in bifurcation vs non-bifurcation lesions (TLF, 6.6% vs 2.7%, respectively; adjusted P<.001), including significant differences in each component of TLF and in stent thrombosis. Between 31 days and 3 years, TLF, its components, and stent thrombosis did not differ significantly between bifurcation and non-bifurcation lesions (TLF, 7.7% vs 9.0%, respectively; adjusted P=.50). CONCLUSION The 3-year risk of TLF following PCI with the R-ZES in bifurcation lesions was not significantly different from that in non-bifurcation lesions. However, there was an increased risk associated with bifurcation lesions during the first 30 days; beyond 30 days, bifurcation and non-bifurcation lesions yielded similar 3-year outcomes.

Relevance: 100.00%

Abstract:

Background Concurrent cardiac diseases are frequent among elderly patients and call for simultaneous treatment to ensure an overall favourable patient outcome. Aim To investigate the feasibility of combined single-session percutaneous cardiac interventions in the era of transcatheter aortic valve implantation (TAVI). Methods This prospective case–control study included 10 consecutive patients treated with TAVI, left atrial appendage occlusion and percutaneous coronary interventions; some additionally underwent patent foramen ovale or atrial septal defect closure in the same session. Patients were matched 1:10 on baseline factors with TAVI-only cases treated at the same institution within the same time period. Outcomes were assessed according to the Valve Academic Research Consortium (VARC) criteria. Results Procedural time (126±42 vs 83±40 min, p=0.0016), radiation time (34±8 vs 22±12 min, p=0.0001) and contrast dye volume (397±89 vs 250±105 mL, p<0.0001) were higher in the combined-intervention group than in the TAVI-only group. Despite these drawbacks, no difference in the VARC endpoints was evident during the in-hospital period or after 30 days (VARC combined safety endpoint 32% for TAVI only and 20% for combined intervention, p=1.0). Conclusions Transcatheter treatment of combined cardiac diseases is feasible, even in a single session, in a high-volume centre with experienced operators.

Relevance: 100.00%

Abstract:

BACKGROUND Rapid pulmonary vein (PV) activity has been shown to maintain paroxysmal atrial fibrillation (AF). In persistent AF, we evaluated the cycle length (CL) gradient between the PVs and the left atrium (LA) in an attempt to identify the subset of patients in whom the PVs play an important role. METHODS AND RESULTS Ninety-seven consecutive patients undergoing first ablation for persistent AF were studied. For each PV, the CL of the fastest activation was assessed over 1 minute (PVfast) using Lasso recordings. The PV-to-LA CL gradient was quantified as the ratio of the PVfast CL to the LA appendage (LAA) AF CL. Stepwise ablation terminated AF in 73 patients (75%). In the AF termination group, the PVfast CL was much shorter than the LAA CL, resulting in lower PVfast/LAA ratios than in the nontermination group (71±10% versus 92±7%; P<0.001). Within the termination group, PVfast/LAA ratios were notably lower if AF terminated after PV isolation or limited adjunctive substrate ablation than in patients who required moderate or extensive ablation (63±6% versus 75±8%; P<0.001). A PVfast/LAA ratio <69% predicted AF termination after PV isolation or limited substrate ablation with a 74% positive predictive value and a 95% negative predictive value. After a mean follow-up of 29±17 months, freedom from arrhythmia recurrence off antiarrhythmic drugs was achieved more often in patients with PVfast/LAA ratios <69% than in the remaining population (80% versus 43%; P<0.001). CONCLUSIONS The PV-to-LA CL gradient may identify the subset of patients in whom persistent AF is likely to terminate after PV isolation or limited substrate ablation and in whom better long-term outcomes are achieved.
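
The PVfast/LAA ratio described in this abstract is a plain cycle-length ratio to which the reported <69% cut-off is applied (positive predictive value 74%, negative predictive value 95% in this cohort). The following is a minimal illustrative sketch of the calculation; the function and variable names are hypothetical and not drawn from the study software.

```python
# Illustrative sketch: PVfast/LAA cycle-length ratio and the reported <69% cut-off.
# pv_fast_cl_ms: CL of the fastest PV activation over 1 minute (ms);
# laa_cl_ms: AF cycle length in the left atrial appendage (ms). Names are hypothetical.

def pvfast_laa_ratio(pv_fast_cl_ms: float, laa_cl_ms: float) -> float:
    """Return the PVfast/LAA cycle-length ratio as a percentage."""
    return 100.0 * pv_fast_cl_ms / laa_cl_ms

def predicts_termination_with_limited_ablation(ratio_percent: float,
                                               cutoff: float = 69.0) -> bool:
    """Apply the reported cut-off (<69%) for AF termination after PV isolation
    or limited substrate ablation."""
    return ratio_percent < cutoff

# Example with hypothetical cycle lengths: 120 ms / 185 ms -> ~64.9%, below the cut-off
ratio = pvfast_laa_ratio(pv_fast_cl_ms=120.0, laa_cl_ms=185.0)
print(round(ratio, 1), predicts_termination_with_limited_ablation(ratio))  # 64.9 True
```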

Relevance: 100.00%

Abstract:

INTRODUCTION The incidence, treatment, and outcome of urethral recurrence (UR) after radical cystectomy (RC) for muscle-invasive bladder cancer with an orthotopic neobladder in women have rarely been addressed in the literature. PATIENTS AND METHODS A total of 12 patients (median age at recurrence: 60 years) who experienced UR after RC with an orthotopic neobladder were selected for this study from a cohort of 456 women from participating institutions. The primary clinical and pathological characteristics at RC, as well as the manifestation of the UR and its treatment and outcome, were reviewed. RESULTS The primary bladder tumors in the 12 patients were urothelial carcinoma in 8 patients, squamous cell carcinoma and adenocarcinoma in 1 patient each, and mixed histology in 2 patients. Three patients (25%) had lymph node-positive disease at RC. The median time from RC to the detection of UR was 8 months (range 4-55). Eight recurrences manifested with clinical symptoms; 4 were detected during follow-up or during a diagnostic work-up for symptoms caused by distant metastases. Treatment modalities were surgery, chemotherapy, radiotherapy, and bacillus Calmette-Guérin urethral instillations. Nine patients died of cancer. The median survival after the diagnosis of UR was 6 months. CONCLUSIONS UR after RC with an orthotopic neobladder in women is rare. Solitary, noninvasive recurrences have a favorable prognosis when detected early. Invasive recurrences are often associated with local and distant metastases and have a poor prognosis.

Relevance: 100.00%

Abstract:

BACKGROUND Up to 1 in 6 patients undergoing transcatheter aortic valve implantation (TAVI) present with low-ejection fraction, low-gradient (LEF-LG) severe aortic stenosis, and concomitant relevant mitral regurgitation (MR) is present in 30% to 55% of these patients. The effect of MR on the clinical outcomes of LEF-LG patients undergoing TAVI is unknown. METHODS AND RESULTS Of 606 consecutive patients undergoing TAVI, 113 (18.7%) patients with LEF-LG severe aortic stenosis (mean gradient ≤40 mm Hg, aortic valve area <1.0 cm², left ventricular ejection fraction <50%) were analyzed. LEF-LG patients were dichotomized into ≤mild MR (n=52) and ≥moderate MR (n=61). The primary end point was all-cause mortality at 1 year. No difference in mortality was observed at 30 days (P=0.76). At 1 year, LEF-LG patients with ≥moderate MR had an adjusted 3-fold higher rate of all-cause mortality than LEF-LG patients with ≤mild MR (38.1% versus 11.5%; adjusted hazard ratio, 3.27 [95% confidence interval, 1.31-8.15]; P=0.011). Mortality was mainly driven by cardiac death (adjusted hazard ratio, 4.62; P=0.005). Compared with LEF-LG patients with ≥moderate MR assigned to medical therapy, LEF-LG patients with ≥moderate MR undergoing TAVI had significantly lower all-cause mortality at 1 year (hazard ratio, 0.38; 95% confidence interval, 0.19-0.75). CONCLUSIONS Moderate or severe MR is a strong independent predictor of late mortality in LEF-LG patients undergoing TAVI. However, LEF-LG patients assigned to medical therapy have a dismal prognosis independent of MR severity, suggesting that TAVI should not be withheld from symptomatic patients with LEF-LG severe aortic stenosis even in the presence of moderate or severe MR.
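
The LEF-LG phenotype and the MR dichotomization in this abstract rest on explicit thresholds (mean gradient ≤40 mm Hg, aortic valve area <1.0 cm², LVEF <50%; ≤mild versus ≥moderate MR). Below is a minimal illustrative sketch of that stratification under those stated thresholds; the data structure and names are hypothetical, not the study's analysis code.

```python
# Illustrative sketch of the LEF-LG definition and MR dichotomization from the abstract.
# Thresholds come from the abstract; grade ordering and names are hypothetical.

MR_GRADES = ["none", "mild", "moderate", "severe"]

def is_lef_lg(mean_gradient_mmhg: float, ava_cm2: float, lvef_percent: float) -> bool:
    """Return True for low-ejection fraction, low-gradient severe aortic stenosis."""
    return mean_gradient_mmhg <= 40.0 and ava_cm2 < 1.0 and lvef_percent < 50.0

def mr_stratum(mr_grade: str) -> str:
    """Dichotomize mitral regurgitation as in the study: <= mild vs >= moderate."""
    return ">= moderate MR" if MR_GRADES.index(mr_grade) >= 2 else "<= mild MR"

# Example with hypothetical echo values: mean gradient 32 mm Hg, AVA 0.8 cm^2, LVEF 35%
print(is_lef_lg(32.0, 0.8, 35.0), mr_stratum("moderate"))  # True '>= moderate MR'
```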

Relevance: 100.00%

Abstract:

Introduction: Current demographic changes are characterized by population aging, so the surgical treatment of degenerative spine conditions in the elderly is gaining increasing relevance. However, there is a general reluctance to consider spinal fusion procedures in this age group because of the increased likelihood of complications. The aim of this study was to assess the patient-rated outcome and complication rates associated with lumbar fusion procedures in three different age groups. Methods: This was a retrospective analysis of prospectively collected data from consecutive patients who underwent first-time, one- to three-level posterior instrumented fusion between 2004 and 2011 for degenerative disease of the lumbar spine. Data were obtained from our Spine Surgery Outcomes Database (linked to the International Spine Tango Register). Before surgery, patients completed the multidimensional Core Outcome Measures Index (COMI); at 3 and 12 months after surgery they completed the COMI and rated the Global Treatment Outcome (GTO) and their satisfaction with care. Patients were divided into three age groups: younger (≥50 to <65 years; n = 317), older (≥65 to <80 years; n = 350), and geriatric (≥80 years; n = 40). Results: 707 consecutive patients were included. Preoperative comorbidity status differed significantly (p < 0.0001) between the age groups, with the highest scores in the geriatric group. General medical complications during surgery were less frequent in the younger age group (7%) than in the older (13.4%; p = 0.006) and geriatric groups (17.5%; p = 0.007). Duration of hospital stay was longer (p = 0.006) in the older group (10.8 ± 3.7 days) than in the younger group (10.0 ± 3.6 days). There were no significant group differences (p > 0.05) for any of the COMI domains covering pain, function, symptom-specific well-being, general quality of life, and social and work disability at either the 3-month or the 12-month follow-up. Similarly, there were no differences (p > 0.05) between the age groups for GTO or patient-rated satisfaction at either follow-up. Conclusions: Preoperative comorbidity and general medical complications during lumbar fusion for degenerative disorders of the lumbar spine are both greater in geriatric patients than in younger patients. However, patient-rated outcome is as good in the elderly as in younger age groups. These data suggest that geriatric age per se is not a contraindication to instrumented fusion for lumbar degenerative disease.

Relevance: 100.00%

Abstract:

OBJECTIVES Many paediatric antiretroviral therapy (ART) programmes in Southern Africa rely on CD4⁺ cell counts to monitor ART. We assessed the benefit of replacing CD4⁺ cell count monitoring with viral load monitoring. DESIGN A mathematical modelling study. METHODS We used a simulation model of HIV progression over 5 years in children on ART, parameterized with data from seven South African cohorts. We simulated treatment programmes with 6-monthly CD4⁺ cell count monitoring or 6- or 12-monthly viral load monitoring, and compared mortality, second-line ART use, immunological failure, and time spent on failing ART. In further analyses, we varied the rate of virological failure and assumed that this rate is higher with CD4⁺ cell count than with viral load monitoring. RESULTS About 7% of children were predicted to die within 5 years, independent of the monitoring strategy. Compared with CD4⁺ cell count monitoring, 12-monthly viral load monitoring reduced the 5-year risk of immunological failure from 1.6% to 1.0% and the mean time spent on failing ART from 6.6 to 3.6 months; 1% of children with CD4⁺ cell count monitoring compared with 12% with viral load monitoring switched to second-line ART. Differences became larger when higher rates of virological failure were assumed. When virological failure rates were assumed to be higher with CD4⁺ cell count than with viral load monitoring, up to 4.2% of children with CD4⁺ cell count monitoring compared with 1.5% with viral load monitoring experienced immunological failure; the mean time spent on failing ART was 27.3 months with CD4⁺ cell count monitoring and 6.0 months with viral load monitoring. CONCLUSION Viral load monitoring did not affect 5-year mortality, but it reduced the time spent on failing ART, improved immunological response, and increased switching to second-line ART.

Relevance: 100.00%

Abstract:

BACKGROUND & AIMS Wilson disease is an autosomal recessive disorder of copper metabolism that leads to copper accumulation in the liver, central nervous system, and kidneys. There are few data on long-term outcomes and survival from large cohorts; we studied these features in a well-characterized Austrian cohort of patients with Wilson disease. METHODS We analyzed data from 229 patients diagnosed with Wilson disease from 1961 through 2013; 175 regularly attended a Wilson disease outpatient clinic and/or their physicians were contacted for information on disease status, treatment, and outcomes. For 53 patients lost during the follow-up period, deaths and their causes were identified from the Austrian death registry. RESULTS The mean observation period was 14.8 ± 11.4 years (range, 0.5-52.0 years), yielding 3116 patient-years. Of the patients, 61% presented with hepatic disease, 27% with neurologic symptoms, and 10% were diagnosed by family screening at presymptomatic stages. Patients with a hepatic presentation were diagnosed at a younger age (21.2 ± 12.0 years) than patients with neurologic disease (28.8 ± 12.0 years; P < .001). In 2% of patients, neither the symptoms nor the onset of symptoms could be determined with certainty. Most patients stabilized (35%) or improved on chelation therapy (26% fully recovered, 24% improved), but 15% deteriorated; 8% required a liver transplant, and 7.4% died within the observation period (71% of deaths were related to Wilson disease). A lower proportion of patients with Wilson disease survived for 20 years (92%) than healthy Austrians (97%), adjusted for age and sex (P = .03). Cirrhosis at diagnosis was the best predictor of death (odds ratio, 6.8; 95% confidence interval, 1.5-31.03; P = .013) and of the need for a liver transplant (odds ratio, 0.07; 95% confidence interval, 0.016-0.307; P < .001). Only 84% of patients with cirrhosis survived 20 years after diagnosis (compared with healthy Austrians, P = .008). CONCLUSION Overall, patients who receive adequate care for Wilson disease have a good long-term prognosis. However, cirrhosis increases the risk of death and liver disease. Early diagnosis, at a precirrhotic stage, might increase survival times and reduce the need for a liver transplant.

Relevance: 100.00%

Abstract:

BACKGROUND New-generation transcatheter heart valves (THV) may improve the clinical outcomes of transcatheter aortic valve implantation. METHODS AND RESULTS In a nationwide, prospective, multicenter cohort study (Swiss Transcatheter Aortic Valve Implantation Registry, NCT01368250), outcomes of consecutive transfemoral transcatheter aortic valve implantation patients treated with the Sapien 3 THV (S3) versus the Sapien XT THV (XT) were investigated. A total of 153 consecutive S3 patients were compared with 445 consecutive XT patients. The postprocedural mean transprosthetic gradient (6.5±3.0 versus 7.8±6.3 mm Hg, P=0.17) did not differ between S3 and XT patients. The rates of more than mild paravalvular regurgitation (1.3% versus 5.3%, P=0.04) and of vascular complications (5.3% versus 16.9%, P<0.01) were significantly lower in S3 patients. A higher rate of new permanent pacemaker implantation was observed in patients receiving the S3 valve (17.0% versus 11.0%, P=0.01). There were no significant differences in disabling stroke (S3 1.3% versus XT 3.1%, P=0.29) or all-cause mortality (S3 3.3% versus XT 4.5%, P=0.27). CONCLUSIONS The use of the new-generation S3 balloon-expandable THV reduced the risk of more than mild paravalvular regurgitation and of vascular complications, but it was associated with a higher permanent pacemaker rate than the XT. Transcatheter aortic valve implantation using the newest-generation balloon-expandable THV is associated with a low risk of stroke and favorable clinical outcomes. CLINICAL TRIAL REGISTRATION URL: http://www.clinicaltrials.gov. Unique identifier: NCT01368250.