881 results for term: follow-up
Abstract:
Purpose: The aim of this trial was to evaluate telescopic-retained prostheses on teeth and implants. Materials and Methods: Ten patients with a mean of 2.8 teeth received strategic implants to achieve triangular/quadrangular support. Survival and complication rates were estimated for telescopic abutments and prostheses. Results: After a mean observation period of > 2 years, no abutment was lost and all prostheses were in function. Complication rates were low, and maintenance services were limited to minor interventions. Conclusions: Combined tooth-implant-retained telescopic prostheses improve prosthetic support and offer successful function over a midterm period in patients with a severely reduced dentition.
Abstract:
Loss to follow-up (LTFU) is a common problem in many epidemiological studies. In antiretroviral treatment (ART) programs for patients with human immunodeficiency virus (HIV), mortality estimates can be biased if the LTFU mechanism is non-ignorable, that is, mortality differs between lost and retained patients. In this setting, routine procedures for handling missing data may lead to biased estimates. To appropriately deal with non-ignorable LTFU, explicit modeling of the missing data mechanism is needed. This can be based on additional outcome ascertainment for a sample of patients LTFU, for example, through linkage to national registries or through survey-based methods. In this paper, we demonstrate how this additional information can be used to construct estimators based on inverse probability weights (IPW) or multiple imputation. We use simulations to contrast the performance of the proposed estimators with methods widely used in HIV cohort research for dealing with missing data. The practical implications of our approach are illustrated using South African ART data, which are partially linkable to South African national vital registration data. Our results demonstrate that while IPWs and proper imputation procedures can be easily constructed from additional outcome ascertainment to obtain valid overall estimates, neglecting non-ignorable LTFU can result in substantial bias. We believe the proposed estimators are readily applicable to a growing number of studies where LTFU is appreciable, but additional outcome data are available through linkage or surveys of patients LTFU. Copyright © 2013 John Wiley & Sons, Ltd.
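The estimator described above relies on tracing a known fraction of patients lost to follow-up and up-weighting their ascertained outcomes. A minimal sketch of that idea in Python (not the authors' code; the cohort size, mortality rates and 25% tracing fraction are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10_000

# Hypothetical cohort: mortality is higher among patients lost to follow-up (non-ignorable LTFU).
ltfu = rng.random(n) < 0.30                      # 30% lost to follow-up
death = np.where(ltfu, rng.random(n) < 0.20,     # 20% mortality if LTFU
                       rng.random(n) < 0.05)     # 5% mortality if retained

# Naive estimate: ignore the lost patients (complete-case analysis).
naive = death[~ltfu].mean()

# Outcome ascertainment: trace a random 25% sample of LTFU patients (e.g., registry linkage).
p_trace = 0.25
traced = ltfu & (rng.random(n) < p_trace)

# Inverse-probability weights: retained patients get weight 1,
# traced LTFU patients get weight 1 / p_trace, untraced LTFU patients are excluded.
weights = np.where(~ltfu, 1.0, np.where(traced, 1.0 / p_trace, 0.0))
ipw_estimate = np.sum(weights * death) / np.sum(weights)

print(f"true mortality : {death.mean():.3f}")
print(f"naive estimate : {naive:.3f}")         # biased downward
print(f"IPW estimate   : {ipw_estimate:.3f}")  # approximately unbiased
```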
Abstract:
OBJECTIVES: Treatment as prevention depends on retaining HIV-infected patients in care. We investigated the effect on HIV transmission of bringing patients lost to follow-up (LTFU) back into care. DESIGN: Mathematical model. METHODS: Stochastic mathematical model of cohorts of 1000 HIV-infected patients on antiretroviral therapy (ART), based on data from two clinics in Lilongwe, Malawi. We calculated cohort viral load (CVL; sum of individual mean viral loads each year) and used a mathematical relationship between viral load and transmission probability to estimate the number of new HIV infections. We simulated four scenarios: 'no LTFU' (all patients stay in care); 'no tracing' (patients LTFU are not traced); 'immediate tracing' (after a missed clinic appointment); and 'delayed tracing' (after six months). RESULTS: About 440 of 1000 patients were LTFU over five years. CVL (million copies/ml per 1000 patients) was 3.7 (95% prediction interval [PrI] 2.9-4.9) for no LTFU, 8.6 (95% PrI 7.3-10.0) for no tracing, 7.7 (95% PrI 6.2-9.1) for immediate tracing, and 8.0 (95% PrI 6.7-9.5) for delayed tracing. Comparing no LTFU with no tracing, the number of new infections increased from 33 (95% PrI 29-38) to 54 (95% PrI 47-60) per 1000 patients. Immediate tracing prevented 3.6 (95% PrI -3.3-12.8) and delayed tracing 2.5 (95% PrI -5.8-11.1) new infections per 1000 patients. Immediate tracing was more efficient than delayed tracing: 116 and 142 tracing efforts, respectively, were needed to prevent one new infection. CONCLUSION: Tracing of patients LTFU enhances the preventive effect of ART, but the number of transmissions prevented is small.
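The model links each patient's mean viral load to a transmission probability and sums these over the cohort. A toy sketch of that mechanism (not the published model; the viral-load distribution, the transmission function and all parameter values are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1000

def expected_infections(viral_loads, base_rate=0.05, ref_log_vl=4.0, per_log_rr=2.45):
    """Toy viral-load/transmission relationship (parameters illustrative only):
    annual transmission risk scales multiplicatively per log10 copies/ml."""
    log_vl = np.log10(np.maximum(viral_loads, 50))
    return np.sum(base_rate * per_log_rr ** (log_vl - ref_log_vl))

# Scenario 'no LTFU': everyone stays on ART and remains virally suppressed.
vl_in_care = np.full(n, 50.0)                       # copies/ml, below detection
# Scenario 'no tracing': ~44% are lost over time and rebound to pre-ART viral loads.
lost = rng.random(n) < 0.44
vl_no_tracing = np.where(lost, 10 ** rng.normal(4.5, 0.7, n), 50.0)

print("expected new infections/year, no LTFU   :", round(expected_infections(vl_in_care), 1))
print("expected new infections/year, no tracing:", round(expected_infections(vl_no_tracing), 1))
```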
Abstract:
Aims: Early-generation drug-eluting stent (DES) overlap (OL) is associated with impaired long-term clinical outcomes, whereas the impact of OL with newer-generation DES is unknown. Our aim was to assess the impact of OL on long-term clinical outcomes among patients treated with newer-generation DES. Methods and results: We analysed the three-year clinical outcomes of 3,133 patients included in a prospective DES registry according to stent type (sirolimus-eluting stents [SES; N=1,532] versus everolimus-eluting stents [EES; N=1,601]), and the presence or absence of OL. The primary outcome was a composite of death, myocardial infarction (MI), and target vessel revascularisation (TVR). The primary outcome was more common in patients with OL (25.1%) than in those with multiple DES without OL (20.8%, adj HR=1.46, 95% CI: 1.03-2.09) and patients with a single DES (18.8%, adj HR=1.74, 95% CI: 1.34-2.25, p<0.001) at three years. A stratified analysis by stent type showed a higher risk of the primary outcome in SES with OL (28.7%) compared with other SES groups (without OL: 22.6%, p=0.04; single DES: 17.6%, p<0.001), but no such difference between EES with OL (22.3%) and other EES groups (without OL: 18.5%, p=0.30; single DES: 20.4%, p=0.20). Conclusions: DES overlap is associated with impaired clinical outcomes during long-term follow-up. In contrast to SES, EES provide similar clinical outcomes irrespective of DES overlap status.
Abstract:
OBJECTIVE To investigate whether it is valid to combine follow-up and change data when conducting meta-analyses of continuous outcomes. STUDY DESIGN AND SETTING Meta-epidemiological study of randomized controlled trials in patients with osteoarthritis of the knee/hip that assessed patient-reported pain. We calculated standardized mean differences (SMDs) based on follow-up and change data and pooled within-trial differences in SMDs. We also derived pooled SMDs from the estimate indicating the largest treatment effect within each trial (optimistic selection of SMDs) and from the estimate indicating the smallest treatment effect within each trial (pessimistic selection of SMDs). RESULTS A total of 21 meta-analyses with 189 trials, 292 randomized comparisons, and 41,256 patients were included. On average, SMDs were 0.04 standard deviation units more beneficial when follow-up values were used (difference in SMDs: -0.04; 95% confidence interval: -0.13, 0.06; P=0.44). In 13 meta-analyses (62%), there was a relevant difference in clinical and/or statistical significance between optimistic and pessimistic pooled SMDs. CONCLUSION On average, there is no relevant difference between follow-up and change data SMDs, and combining these estimates in meta-analysis is generally valid. The decision on which type of data to use when both follow-up and change data are available should be prespecified in the meta-analysis protocol.
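Both kinds of SMD divide the between-group mean difference by a pooled standard deviation; they differ only in whether follow-up scores or change-from-baseline scores are used, which is why the two can diverge when change scores have a smaller spread. A minimal sketch with made-up trial summary data (the numbers are illustrative, not from the included trials):

```python
import numpy as np

def smd(mean_trt, mean_ctl, sd_trt, sd_ctl, n_trt, n_ctl):
    """Standardized mean difference: mean difference divided by the pooled SD."""
    pooled_sd = np.sqrt(((n_trt - 1) * sd_trt**2 + (n_ctl - 1) * sd_ctl**2)
                        / (n_trt + n_ctl - 2))
    return (mean_trt - mean_ctl) / pooled_sd

# Hypothetical trial of knee-pain scores (lower = less pain), n = 100 per arm.
# Follow-up data: mean (SD) pain at the end of the trial.
smd_followup = smd(mean_trt=38, mean_ctl=45, sd_trt=20, sd_ctl=21, n_trt=100, n_ctl=100)
# Change data: mean (SD) change from baseline in the same trial.
smd_change = smd(mean_trt=-22, mean_ctl=-15, sd_trt=15, sd_ctl=16, n_trt=100, n_ctl=100)

print(f"SMD from follow-up scores: {smd_followup:.2f}")   # about -0.34
print(f"SMD from change scores   : {smd_change:.2f}")     # about -0.45
```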
Abstract:
Aims: Newer-generation everolimus-eluting stents (EES) have been shown to improve clinical outcomes compared with early-generation sirolimus-eluting (SES) and paclitaxel-eluting stents (PES) in patients undergoing percutaneous coronary intervention (PCI). Whether this benefit is maintained among patients with saphenous vein graft (SVG) disease remains controversial. Methods and results: We assessed cumulative incidence rates (CIR) per 100 patient-years after inverse probability of treatment weighting to compare clinical outcomes. The pre-specified primary endpoint was the composite of cardiac death, myocardial infarction (MI), and target vessel revascularisation (TVR). Out of 12,339 consecutively treated patients, 288 patients (5.7%) underwent PCI of at least one SVG lesion with EES (n=127), SES (n=103) or PES (n=58). Up to four years, CIR of the primary endpoint were 58.7 for EES, 45.2 for SES and 45.6 for PES, with similar adjusted risks between groups (EES vs. SES: HR 0.94, 95% CI: 0.55-1.60; EES vs. PES: HR 1.07, 95% CI: 0.60-1.91). Adjusted risks showed no significant differences between stent types for cardiac death, MI and TVR. Conclusions: Among patients undergoing PCI for SVG lesions, newer-generation EES have similar safety and efficacy to early-generation SES and PES during long-term follow-up to four years.
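Inverse probability of treatment weighting reweights each patient by the inverse of the estimated probability of receiving the stent type actually received, so that baseline covariates are balanced across the groups. A hedged sketch of how such weights might be built (simulated data; the covariates, cohort size and model are illustrative, not the registry's actual specification):

```python
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 288   # number of SVG-PCI patients in the abstract; the records themselves are simulated

# Hypothetical baseline covariates and the stent type actually received.
df = pd.DataFrame({
    "age": rng.normal(70, 8, n),
    "diabetes": rng.integers(0, 2, n),
    "stent": rng.choice(["EES", "SES", "PES"], size=n, p=[0.44, 0.36, 0.20]),
})

# Multinomial propensity model: P(stent type | covariates).
X = df[["age", "diabetes"]]
ps_model = LogisticRegression(max_iter=1000).fit(X, df["stent"])
ps = ps_model.predict_proba(X)

# Weight = 1 / P(stent actually received | covariates).
col = {c: i for i, c in enumerate(ps_model.classes_)}
ps_received = ps[np.arange(n), df["stent"].map(col).to_numpy()]
df["iptw"] = 1.0 / ps_received

# The weighted pseudo-population is then used for weighted cumulative incidence rates
# or weighted Cox models of the composite endpoint (cardiac death, MI, TVR).
print(df.groupby("stent")["iptw"].agg(["mean", "max"]))
```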
Abstract:
AIMS To assess serially the edge vascular response (EVR) of a bioresorbable vascular scaffold (BVS) compared to a metallic everolimus-eluting stent (EES). METHODS AND RESULTS Non-serial evaluations of the Absorb BVS at one year have previously demonstrated proximal edge constrictive remodelling and distal edge changes in plaque composition with increase of the percent fibro-fatty (FF) tissue component. The 5 mm proximal and distal segments adjacent to the implanted devices were investigated serially with intravascular ultrasound (IVUS), post procedure, at six months and at two years, from the ABSORB Cohort B1 (n=45) and the SPIRIT II (n=113) trials. Twenty-two proximal and twenty-four distal edge segments were available for analysis in the ABSORB Cohort B1 trial. In the SPIRIT II trial, thirty-three proximal and forty-six distal edge segments were analysed. At the 5-mm proximal edge, the vessels treated with an Absorb BVS from post procedure to two years demonstrated a lumen loss (LL) of 6.68% (-17.33; 2.08) (p=0.027) with a trend toward plaque area increase of 7.55% (-4.68; 27.11) (p=0.06). At the 5-mm distal edge no major changes were evident at either time point. At the 5-mm proximal edge the vessels treated with a XIENCE V EES from post procedure to two years did not show any signs of LL, only plaque area decrease of 6.90% (-17.86; 4.23) (p=0.035). At the distal edge no major changes were evident with regard to either lumen area or vessel remodelling at the same time point. CONCLUSIONS The IVUS-based serial evaluation of the EVR up to two years following implantation of a bioresorbable everolimus-eluting scaffold shows a statistically significant proximal edge LL; however, this finding did not seem to have any clinical implications in the serial assessment. The upcoming imaging follow-up of the Absorb BVS at three years is anticipated to provide further information regarding the vessel wall behaviour at the edges.
Abstract:
The automatic implantable defibrillator (AID) is the treatment of choice for primary and secondary prevention of sudden death. At the Instituto Nacional de Cardiología, from October 1996 to January 2002, 25 patients were implanted with 26 AIDs. There were 23 men (92%), and the mean age of the whole group was 51.4 years. Twenty-three patients (92%) presented structural heart disease, the most common being ischemic heart disease in 13 patients (52%), with a mean ejection fraction of 37.8%. One patient without structural heart disease had Brugada syndrome. The most frequent clinical arrhythmia was ventricular tachycardia, in 14 patients (56%). The mean follow-up was 29.3 months, during which a total of 30 ventricular arrhythmia events were treated by the AID; six of these therapies were inappropriate, triggered by paroxysmal atrial fibrillation. In 10 AID patients (34%), no therapy has been delivered. Three patients (12%) died of congestive heart failure refractory to pharmacologic treatment. CONCLUSION: Implantation of the AID is a safe and effective measure for primary and secondary prevention of sudden death. Worldwide experience shows that this kind of device has not modified mortality due to heart failure in these patients, but it has reduced sudden arrhythmic death.
Abstract:
Purpose To determine whether diffusion-weighted (DW) magnetic resonance (MR) imaging in living renal allograft donation allows monitoring of potential changes in the remaining, nontransplanted kidney of the donor after unilateral nephrectomy and in the transplanted kidney before and after transplantation in donor and recipient, respectively, and whether DW MR parameters are correlated in the same kidney before and after transplantation. Materials and Methods The study protocol was approved by the local ethics committee; written informed consent was obtained. Thirteen healthy kidney donors and their corresponding recipients prospectively underwent DW MR imaging (multiple b values) in donors before donation and in donors and recipients at day 8 and months 3 and 12 after donation. Total apparent diffusion coefficient (ADCT) values were determined; the contribution of microcirculation was quantified as the perfusion fraction (FP). Longitudinal changes of diffusion parameters were compared (repeated-measures one-way analysis of variance with post hoc pairwise comparisons). Correlations were tested (linear regression). Results ADCT values in the nontransplanted kidney of donors increased from a preexplantation value of (188 ± 9 [standard deviation]) to (202 ± 11) × 10⁻⁵ mm²/sec in the medulla and from (199 ± 11) to (210 ± 13) × 10⁻⁵ mm²/sec in the cortex 1 week after donation (P < .004). Medullary, but not cortical, ADCT values remained increased up to 1 year. ADCT values in allografts in recipients were stable. Compared with values obtained before transplantation in donors, the corticomedullary difference was reduced in allografts (P < .03). Cortical ADCT values correlated with estimated glomerular filtration rate in recipients (R = 0.56, P < .001) but not in donors. Cortical ADCT values in the same kidney before transplantation in donors correlated with those in recipients on day 8 after transplantation (R = 0.77, P = .006). FP did not show significant changes. Conclusion DW MR imaging depicts early adaptations in the remaining nontransplanted kidney of donors after nephrectomy. All diffusion parameters remained constant in allograft recipients after transplantation. This method has potential monitoring utility, although assessment of its clinical relevance is needed. © RSNA, 2013. Online supplemental material is available for this article.
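ADCT is typically obtained from a mono-exponential fit over all b-values, while the perfusion fraction FP can be estimated by separating the fast pseudo-diffusion (microcirculation) component seen at low b-values. A small sketch of a segmented IVIM-type analysis on synthetic signal values (the b-values, signals and the segmented approach are illustrative assumptions, not the study's exact fitting procedure):

```python
import numpy as np

# Synthetic renal DW-MRI signal at several b-values (s/mm^2); values are illustrative.
b = np.array([0, 10, 20, 50, 100, 200, 400, 600, 800], dtype=float)
signal = np.array([1000, 949, 906, 805, 695, 562, 390, 272, 190], dtype=float)

# Total apparent diffusion coefficient (ADCT): mono-exponential fit over all b-values,
# ln S(b) = ln S0 - b * ADCT.
adc_t = -np.polyfit(b, np.log(signal), 1)[0]

# Perfusion fraction (FP), segmented IVIM-type estimate: at high b-values the
# microcirculation (pseudo-diffusion) term is negligible, so fitting only b >= 200
# yields the tissue diffusion coefficient D and an intercept equal to S0 * (1 - FP).
high = b >= 200
slope, intercept = np.polyfit(b[high], np.log(signal[high]), 1)
d_tissue = -slope
fp = 1.0 - np.exp(intercept) / signal[0]

print(f"ADCT ≈ {adc_t * 1e5:.0f} × 10⁻⁵ mm²/sec")   # ≈ 203 × 10⁻⁵ mm²/sec for these data
print(f"D    ≈ {d_tissue * 1e5:.0f} × 10⁻⁵ mm²/sec")
print(f"FP   ≈ {fp:.2f}")
```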
Abstract:
This study aimed to examine the aetiology of acute diarrhoea and the relapse rate in 100 client-owned dogs presented to a first-opinion clinic. History, physical examination, faecal testing and owner questionnaire data were collected at initial presentation (T0) and either at the time of relapse or at a recheck performed within 3 months. All dogs received treatment according to their clinical signs. Of 96 dogs that completed the study, 37 (38.5%) relapsed during the study period: 21 (21.9%) within 3 months and 16 (16.6%) between 3 months and 1 year after the initial examination. Dogs that had undergone a change in housing location within 1 month prior to presentation and dogs <1 year old were significantly more likely to have positive parasitological analyses (P=0.02 and P=0.001, respectively). Pica was a risk factor for relapse (P=0.0002).
Abstract:
The ÆQUAS (a German acronym for “Work Experiences and Quality of Life in Switzerland”) study followed young workers in five occupations over their first ten years in the labor market. Participants reported on working conditions and well-being on five occasions. Overall, resources at work, as well as well-being, health and personal resources, remained stable or increased. At the same time, task-related stressors also increased. This result may reflect career progress (e.g., gaining more responsibilities may be accompanied by increasing time pressure), but trends in task-related stressors and resources may also be related to specific occupations. Several trajectories had their turning point after the first or second year in the labor market, which may reflect successful professional socialization. Even though a substantial number of participants changed their occupation over these ten years (with benefits for their well-being), development over the first ten years after vocational training points to a successful transition into the labor market.
Abstract:
BACKGROUND: Early detection of colorectal cancer through timely follow-up of positive fecal occult blood tests (FOBTs) remains a challenge. In our previous work, we found that 40% of positive FOBT results eligible for colonoscopy had no documented response by a treating clinician at two weeks, despite procedures for electronic result notification. We determined whether technical and/or workflow-related aspects of automated communication in the electronic health record could lead to this lack of response. METHODS: Using both qualitative and quantitative methods, we evaluated positive FOBT communication in the electronic health record of a large, urban facility between May 2008 and March 2009. We identified the source of the test result communication breakdown and developed an intervention to fix the problem. Explicit medical record reviews measured timely follow-up (defined as response within 30 days of a positive FOBT) pre- and post-intervention. RESULTS: Data from 11 interviews and tracking information from 490 FOBT alerts revealed that the software intended to alert primary care practitioners (PCPs) to positive FOBT results was not configured correctly, and over a third of positive FOBTs were not transmitted to PCPs. Upon correction of the technical problem, the proportion lacking timely follow-up decreased immediately from 29.9% to 5.4% (p<0.01), and the improvement was sustained at month 4 following the intervention. CONCLUSION: Electronic communication of positive FOBT results should be monitored to avoid limiting the benefits of colorectal cancer screening. Robust quality assurance and oversight systems are needed to achieve this. Our methods may be useful for others seeking to improve follow-up of FOBTs in their own systems.
Abstract:
BACKGROUND: Given the fragmentation of outpatient care, timely follow-up of abnormal diagnostic imaging results remains a challenge. We hypothesized that an electronic medical record (EMR) that facilitates the transmission and availability of critical imaging results through either automated notification (alerting) or direct access to the primary report would eliminate this problem. METHODS: We studied critical imaging alert notifications in the outpatient setting of a tertiary care Department of Veterans Affairs facility from November 2007 to June 2008. Tracking software determined whether the alert was acknowledged (ie, health care practitioner/provider [HCP] opened the message for viewing) within 2 weeks of transmission; acknowledged alerts were considered read. We reviewed medical records and contacted HCPs to determine timely follow-up actions (eg, ordering a follow-up test or consultation) within 4 weeks of transmission. Multivariable logistic regression models accounting for clustering effect by HCPs analyzed predictors for 2 outcomes: lack of acknowledgment and lack of timely follow-up. RESULTS: Of 123 638 studies (including radiographs, computed tomographic scans, ultrasonograms, magnetic resonance images, and mammograms), 1196 images (0.97%) generated alerts; 217 (18.1%) of these were unacknowledged. Alerts had a higher risk of being unacknowledged when the ordering HCPs were trainees (odds ratio [OR], 5.58; 95% confidence interval [CI], 2.86-10.89) and when dual-alert (>1 HCP alerted) as opposed to single-alert communication was used (OR, 2.02; 95% CI, 1.22-3.36). Timely follow-up was lacking in 92 (7.7% of all alerts) and was similar for acknowledged and unacknowledged alerts (7.3% vs 9.7%; P = .22). Risk for lack of timely follow-up was higher with dual-alert communication (OR, 1.99; 95% CI, 1.06-3.48) but lower when additional verbal communication was used by the radiologist (OR, 0.12; 95% CI, 0.04-0.38). Nearly all abnormal results lacking timely follow-up at 4 weeks were eventually found to have measurable clinical impact in terms of further diagnostic testing or treatment. CONCLUSIONS: Critical imaging results may not receive timely follow-up actions even when HCPs receive and read results in an advanced, integrated electronic medical record system. A multidisciplinary approach is needed to improve patient safety in this area.
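One common way to fit a multivariable logistic model that accounts for clustering of alerts within ordering providers is a GEE with an exchangeable working correlation. A hedged sketch on simulated data (the variable names, coefficients and the GEE choice are assumptions for illustration, not the authors' exact model):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
n = 1196   # number of alerts in the abstract; the records below are simulated

# Hypothetical alert-level data: one row per transmitted imaging alert.
alerts = pd.DataFrame({
    "provider_id": rng.integers(0, 200, n),   # ordering HCP, the clustering unit
    "trainee": rng.integers(0, 2, n),
    "dual_alert": rng.integers(0, 2, n),
})
logit = -2.0 + 1.7 * alerts["trainee"] + 0.7 * alerts["dual_alert"]
alerts["unacknowledged"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

# Logistic model for lack of acknowledgment with an exchangeable working correlation,
# accounting for clustering of alerts within the same ordering provider.
model = smf.gee("unacknowledged ~ trainee + dual_alert",
                groups="provider_id", data=alerts,
                family=sm.families.Binomial(),
                cov_struct=sm.cov_struct.Exchangeable())
result = model.fit()

# Odds ratios with 95% confidence intervals, analogous in form to the ORs quoted above.
or_table = np.exp(pd.concat([result.params, result.conf_int()], axis=1))
or_table.columns = ["OR", "2.5%", "97.5%"]
print(or_table)
```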
Abstract:
Many persons in the U.S. gain weight during young adulthood, and the prevalence of obesity has been increasing among young adults. Although obesity and physical inactivity are generally recognized as risk factors for coronary heart disease (CHD), the magnitude of their effect on risk may have been seriously underestimated due to failure to adequately handle the problem of cigarette smoking. Since cigarette smoking causes weight loss, physically inactive cigarette smokers may remain relatively lean because they smoke cigarettes. We hypothesize that cigarette smoking modifies the association between weight gain during young adulthood and risk of coronary heart disease during middle age, and that the true effect of weight gain during young adulthood on risk of CHD can be assessed only in persons who have not smoked cigarettes. Specifically, we hypothesize that weight gain during young adulthood is positively associated with risk of CHD during middle age in nonsmokers but that the association is much smaller or absent entirely among cigarette smokers. The purpose of this study was to test this hypothesis. The population for analysis comprised 1,934 middle-aged, employed men whose average age at the baseline examination was 48.7 years. Information collected at the baseline examinations in 1958 and 1959 included recalled weight at age 20, present weight, height, smoking status, and other CHD risk factors. To decrease the effect of intraindividual variation, the mean values of the 1958 and 1959 baseline examinations were used in analyses. Change in body mass index (ΔBMI) during young adulthood was the primary exposure variable and was measured as BMI at baseline (kg/m²) minus BMI at age 20 (kg/m²). Proportional hazards regression analysis was used to generate relative risks of CHD mortality by category of ΔBMI and cigarette smoking status after adjustment for age, family history of CVD, major organ system disease, BMI at age 20, and number of cigarettes smoked per day. Adjustment was not performed for systolic blood pressure or total serum cholesterol, as these were regarded as intervening variables. Vital status was known for all men on the 25th anniversary of their baseline examinations. A total of 705 deaths (including 319 CHD deaths) occurred over 40,136 person-years of experience. ΔBMI was positively associated with risk of CHD mortality in never-smokers, but not in ever-smokers (p for interaction = 0.067). For never-smokers with stable, low-gain, moderate-gain, and high-gain ΔBMI, adjusted relative risks were 1.00, 1.62, 1.61, and 2.78, respectively (p for trend = 0.010). For ever-smokers with stable, low-gain, moderate-gain, and high-gain ΔBMI, adjusted relative risks were 1.00, 0.74, 1.07, and 1.06, respectively (p for trend = 0.422). These results support the research hypothesis that cigarette smoking modifies the association between weight gain and CHD mortality. The effect of obesity and physical inactivity on risk of coronary mortality may therefore have been seriously underestimated in current estimates due to inadequate handling of cigarette smoking.
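The analysis fits proportional hazards models for CHD mortality by ΔBMI category separately in never- and ever-smokers. A minimal sketch of that kind of model on simulated data (the cut points, covariate set and all values are illustrative; only the 1,934-man cohort size is taken from the abstract):

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(4)
n = 1934   # cohort size from the abstract; all records below are simulated for illustration

# Hypothetical analysis file: one row per man, followed up to 25 years for CHD death.
df = pd.DataFrame({
    "age": rng.normal(48.7, 5, n),
    "bmi_age20": rng.normal(23.0, 2.5, n),
    "delta_bmi": rng.normal(2.5, 2.0, n),     # BMI at baseline minus BMI at age 20 (kg/m²)
    "ever_smoker": rng.integers(0, 2, n),
    "years": rng.uniform(1.0, 25.0, n),       # follow-up time in years
    "chd_death": rng.integers(0, 2, n),
})

# Categorise weight gain (cut points are illustrative) and build dummy variables.
df["gain_cat"] = pd.cut(df["delta_bmi"], [-np.inf, 1, 3, 5, np.inf],
                        labels=["stable", "low_gain", "moderate_gain", "high_gain"])
X = pd.get_dummies(df[["age", "bmi_age20", "gain_cat"]],
                   columns=["gain_cat"], drop_first=True).astype(float)
X[["years", "chd_death"]] = df[["years", "chd_death"]]

# Proportional hazards models fitted separately in never- and ever-smokers:
# exponentiated coefficients are the adjusted relative risks by weight-gain category.
for smoker, idx in df.groupby("ever_smoker").groups.items():
    cph = CoxPHFitter().fit(X.loc[idx], duration_col="years", event_col="chd_death")
    print(f"ever_smoker = {smoker}")
    print(np.exp(cph.params_).round(2), "\n")
```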