947 results for Follow up
Abstract:
BACKGROUND The objective of this study was to compare transtelephonic ECG every 2 days and serial 7-day Holter monitoring as two methods of follow-up for judging ablation success after atrial fibrillation (AF) catheter ablation. Patients with highly symptomatic AF are increasingly treated with catheter ablation. Several methods of follow-up have been described, and the judgment of ablation success often relies on patients' symptoms. However, the optimal follow-up strategy for objectively detecting most AF recurrences remains unclear. METHODS Thirty patients with highly symptomatic AF were selected for circumferential pulmonary vein ablation. During follow-up, a transtelephonic ECG was transmitted once every 2 days for half a year. Additionally, a 7-day Holter was recorded before ablation, directly after ablation, and at 3 and 6 months. With both procedures, symptoms and actual rhythm were thoroughly correlated. RESULTS A total of 2,600 transtelephonic ECGs were collected, 216 of which showed AF; 25% of those episodes were asymptomatic. On Kaplan-Meier analysis, 45% of the patients with paroxysmal AF were still in continuous sinus rhythm after 6 months. In a simulated follow-up based on symptomatic recurrences only, that number would have increased to 70%. Using serial 7-day ECG, 113 Holter recordings with over 18,900 hours of ECG data were acquired. After 6 months, the percentage of patients classified as free from AF was 50%. Of the patients with recurrences, 30-40% were completely asymptomatic. The percentage of asymptomatic AF episodes increased stepwise from 11% before ablation to 53% at 6 months after ablation. CONCLUSIONS The success rate in terms of freedom from AF was 70% with symptom-only follow-up; it decreased to 50% with serial 7-day Holter monitoring and to 45% with transtelephonic monitoring. Transtelephonic ECG and serial 7-day Holter were equally effective in objectively determining long-term success and in detecting asymptomatic patients.
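The Kaplan-Meier analysis mentioned above estimates the fraction of patients still free from AF at each follow-up time while accounting for censored patients. A minimal sketch of the estimator (the event times and censoring flags below are hypothetical, not the study's data):

```python
def kaplan_meier(times, events):
    """Kaplan-Meier survival estimator.
    times  : follow-up time for each patient (e.g., months until AF
             recurrence or until censoring)
    events : 1 if AF recurred at that time, 0 if the patient was censored
    Returns a list of (time, survival probability) steps."""
    n_at_risk = len(times)
    survival = 1.0
    curve = []
    for t in sorted(set(times)):
        # events (recurrences) observed exactly at time t
        d = sum(1 for ti, ei in zip(times, events) if ti == t and ei == 1)
        if d > 0:
            survival *= 1 - d / n_at_risk
            curve.append((t, survival))
        # everyone whose follow-up ends at t leaves the risk set
        n_at_risk -= sum(1 for ti in times if ti == t)
    return curve

# Hypothetical cohort: months to recurrence (event=1) or censoring (event=0)
times = [1, 2, 2, 3, 4, 5, 6, 6, 6, 6]
events = [1, 1, 0, 1, 1, 0, 0, 0, 0, 0]
curve = kaplan_meier(times, events)
```

In practice such analyses are usually run with a dedicated package (e.g., lifelines in Python or the survival package in R), which also provides confidence intervals and log-rank tests.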
Abstract:
BACKGROUND Trials assessing the benefit of immediate androgen-deprivation therapy (ADT) for treating prostate cancer (PCa) have often done so based on differences in detectable prostate-specific antigen (PSA) relapse or metastatic disease rates at a specific time after randomization. OBJECTIVE Based on the long-term results of European Organization for Research and Treatment of Cancer (EORTC) trial 30891, we asked whether differences in time to progression predict differences in survival. DESIGN, SETTING, AND PARTICIPANTS EORTC trial 30891 compared immediate ADT (orchiectomy or a luteinizing hormone-releasing hormone analog; n=492) with deferred ADT (n=493) initiated upon symptomatic disease progression or life-threatening complications in randomly assigned T0-4 N0-2 M0 PCa patients. OUTCOME MEASUREMENTS AND STATISTICAL ANALYSIS Time to first objective progression (documented metastases or ureteric obstruction, not PSA rise) and time to objective castration-resistant progressive disease were compared, as were PCa mortality and overall survival. RESULTS AND LIMITATIONS After a median of 12.8 yr, 769 of the 985 patients had died (78%), 269 of PCa (27%). For patients receiving deferred ADT, the overall treatment time was 31% of that for patients on immediate ADT. Deferred ADT was significantly worse than immediate ADT for time to first objective disease progression (p<0.0001; 10-yr progression rates 42% vs 30%). However, time to objective castration-resistant disease after deferred ADT did not differ significantly (p=0.42) from that after immediate ADT. In addition, PCa mortality did not differ significantly, except in patients with aggressive PCa resulting in death within 3-5 yr after diagnosis. Deferred ADT was inferior to immediate ADT in terms of overall survival (hazard ratio: 1.21; 95% confidence interval, 1.05-1.39; p [noninferiority]=0.72, p [difference]=0.0085).
CONCLUSIONS This study shows that if hormonal manipulation is used at different times during the disease course, differences in time to first disease progression cannot predict differences in disease-specific survival. A deferred ADT policy may substantially reduce the time on treatment, but it is not suitable for patients with rapidly progressing disease.
Abstract:
Purpose To determine whether diffusion-weighted (DW) magnetic resonance (MR) imaging in living renal allograft donation allows monitoring of potential changes in the donor's remaining nontransplanted kidney after unilateral nephrectomy and in the transplanted kidney before and after transplantation (in donor and recipient, respectively), and whether DW MR parameters in the same kidney are correlated before and after transplantation. Materials and Methods The study protocol was approved by the local ethics committee; written informed consent was obtained. Thirteen healthy kidney donors and their corresponding recipients prospectively underwent DW MR imaging (multiple b values) in donors before donation and in donors and recipients at day 8 and months 3 and 12 after donation. Total apparent diffusion coefficient (ADCT) values were determined; the contribution of microcirculation was quantified as the perfusion fraction (FP). Longitudinal changes of diffusion parameters were compared (repeated-measures one-way analysis of variance with post hoc pairwise comparisons). Correlations were tested (linear regression). Results ADCT values in the nontransplanted kidney of donors increased 1 week after donation from a preexplantation value of (188 ± 9 [standard deviation]) × 10⁻⁵ mm²/sec to (202 ± 11) × 10⁻⁵ mm²/sec in the medulla and from (199 ± 11) × 10⁻⁵ mm²/sec to (210 ± 13) × 10⁻⁵ mm²/sec in the cortex (P < .004). Medullary, but not cortical, ADCT values remained increased up to 1 year. ADCT values in allografts in recipients were stable. Compared with values obtained before transplantation in donors, the corticomedullary difference was reduced in allografts (P < .03). Cortical ADCT values correlated with estimated glomerular filtration rate in recipients (R = 0.56, P < .001) but not in donors. Cortical ADCT values in the same kidney before transplantation in donors correlated with those in recipients on day 8 after transplantation (R = 0.77, P = .006). FP did not show significant changes.
Conclusion DW MR imaging depicts early adaptations in the remaining nontransplanted kidney of donors after nephrectomy. All diffusion parameters remained constant in allograft recipients after transplantation. This method has potential monitoring utility, although assessment of its clinical relevance is needed. © RSNA, 2013. Online supplemental material is available for this article.
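The perfusion fraction FP is conventionally estimated from the bi-exponential intravoxel incoherent motion (IVIM) model of the multi-b-value signal decay, S(b)/S0 = FP·exp(−b·D*) + (1−FP)·exp(−b·D). A minimal sketch using a segmented fit on synthetic, noiseless data (the b-values and tissue parameters are illustrative, not taken from this study):

```python
import numpy as np

# Bi-exponential IVIM model: S(b)/S0 = FP*exp(-b*D*) + (1-FP)*exp(-b*D)
b = np.array([0, 10, 20, 50, 100, 200, 400, 600, 800], dtype=float)  # s/mm^2
s0, fp_true, d_star, d_true = 1.0, 0.20, 0.030, 0.0020  # illustrative values
signal = s0 * (fp_true * np.exp(-b * d_star) + (1 - fp_true) * np.exp(-b * d_true))

# Segmented fit: at high b-values the fast (perfusion) pool has decayed away,
# so the log-signal is linear with slope -D and intercept log(S0*(1-FP)).
high = b >= 200
slope, intercept = np.polyfit(b[high], np.log(signal[high]), 1)
d_fit = -slope
fp_fit = 1 - np.exp(intercept) / s0
```

With noisy clinical data, a full nonlinear least-squares fit (or a Bayesian approach) over all four parameters is typically preferred; the segmented version shown here is the simplest illustration of how FP separates the perfusion and diffusion compartments.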
Abstract:
This study aimed to examine the aetiology of acute diarrhoea and the relapse rate in 100 client-owned dogs presented to a first-opinion clinic. History, physical examination, faecal testing and owner questionnaire data were collected at initial presentation (T0) and at either the time of relapse or at a recheck performed within 3 months. All dogs received treatment according to their clinical signs. Of 96 dogs that completed the study, 37 (38.5%) relapsed during the study period: 21 (21.9%) within 3 months and 16 (16.6%) between 3 months and 1 year after the initial examination. Dogs that had undergone a change in housing location within 1 month prior to presentation and dogs <1 year old were significantly more likely to have positive parasitological analyses (P=0.02 and P=0.001, respectively). Pica was a risk factor for relapse (P=0.0002).
Abstract:
The ÆQUAS study (a German acronym for “Work Experiences and Quality of Life in Switzerland”) followed young workers in five occupations over their first ten years in the labor market. Participants reported on working conditions and well-being on five occasions. Overall, resources at work, as well as well-being, health, and personal resources, remained stable or increased. Concurrently, task-related stressors increased as well. This result may reflect career progress (e.g., gaining more responsibilities may be accompanied by increasing time pressure), but the development of task-related stressors and resources may also be related to specific occupations. Several trajectories had their turning point after the first or second year in the labor market, which may reflect successful professional socialization. Even though a substantial number of participants changed their occupation over these ten years (with benefits for their well-being), development over the first ten years after vocational training indicates a successful transition into the labor market.
Abstract:
Recovery from acute episodes of thrombotic thrombocytopenic purpura (TTP) appears complete except for minor cognitive abnormalities and risk for relapse. The Oklahoma TTP-HUS (hemolytic uremic syndrome) Registry enrolled 70 consecutive patients from 1995 to 2011 with ADAMTS13 activity <10% at their initial episode; 57 survived, with follow-up through 2012. The prevalence of body mass index (BMI), glomerular filtration rate (GFR), urine albumin/creatinine ratio (ACR), hypertension, major depression, systemic lupus erythematosus (SLE), and risk of death were compared with expected values based on the US reference population. At initial diagnosis, 57 survivors had a median age of 39 years; 45 (79%) were women; 21 (37%) were black; BMI and prevalence of SLE (7%) were greater (P < .001) than expected; prevalence of hypertension (19%; P = .463) was not different. GFR (P = .397) and ACR (P = .793) were not different from expected values. In 2011-2012, prevalence of hypertension (40% vs 23%; P = .013) and major depression (19% vs 6%; P = .005) was greater than expected values. Eleven patients (19%) have died, a proportion greater than expected compared with US and Oklahoma reference populations (P < .05). TTP survivors may have greater risk for poor health and premature death.
Abstract:
BACKGROUND: Early detection of colorectal cancer through timely follow-up of positive Fecal Occult Blood Tests (FOBTs) remains a challenge. In our previous work, we found that 40% of positive FOBT results eligible for colonoscopy had no documented response by a treating clinician at two weeks, despite procedures for electronic result notification. We determined whether technical and/or workflow-related aspects of automated communication in the electronic health record could lead to the lack of response. METHODS: Using both qualitative and quantitative methods, we evaluated positive FOBT communication in the electronic health record of a large, urban facility between May 2008 and March 2009. We identified the source of the test result communication breakdown and developed an intervention to fix the problem. Explicit medical record reviews measured timely follow-up (defined as a response within 30 days of a positive FOBT) pre- and post-intervention. RESULTS: Data from 11 interviews and tracking information from 490 FOBT alerts revealed that the software intended to alert primary care practitioners (PCPs) of positive FOBT results was not configured correctly, and over a third of positive FOBTs were not transmitted to PCPs. Upon correction of the technical problem, lack of timely follow-up decreased immediately from 29.9% to 5.4% (p<0.01) and was sustained at month 4 following the intervention. CONCLUSION: Electronic communication of positive FOBT results should be monitored to avoid limiting colorectal cancer screening benefits. Robust quality assurance and oversight systems are needed to achieve this. Our methods may be useful for others seeking to improve follow-up of FOBTs in their systems.
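The pre/post comparison above (lack of timely follow-up falling from 29.9% to 5.4%) is a two-proportion comparison. A sketch of the standard pooled two-proportion z-test; the counts below are hypothetical, chosen only to roughly match the reported rates, since the abstract does not give the denominators:

```python
from math import sqrt, erfc

def two_proportion_z(x1, n1, x2, n2):
    """Two-sided two-proportion z-test using the pooled proportion."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = erfc(abs(z) / sqrt(2))  # two-sided normal tail probability
    return z, p_value

# Hypothetical counts (~30% before vs ~5.3% after the intervention)
z, p = two_proportion_z(x1=45, n1=150, x2=8, n2=150)
```

For small cell counts, an exact test (e.g., Fisher's) would be the safer choice; the z-test shown here is the common large-sample approximation.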
Abstract:
BACKGROUND: Given the fragmentation of outpatient care, timely follow-up of abnormal diagnostic imaging results remains a challenge. We hypothesized that an electronic medical record (EMR) that facilitates the transmission and availability of critical imaging results through either automated notification (alerting) or direct access to the primary report would eliminate this problem. METHODS: We studied critical imaging alert notifications in the outpatient setting of a tertiary care Department of Veterans Affairs facility from November 2007 to June 2008. Tracking software determined whether the alert was acknowledged (ie, health care practitioner/provider [HCP] opened the message for viewing) within 2 weeks of transmission; acknowledged alerts were considered read. We reviewed medical records and contacted HCPs to determine timely follow-up actions (eg, ordering a follow-up test or consultation) within 4 weeks of transmission. Multivariable logistic regression models accounting for clustering effect by HCPs analyzed predictors for 2 outcomes: lack of acknowledgment and lack of timely follow-up. RESULTS: Of 123,638 studies (including radiographs, computed tomographic scans, ultrasonograms, magnetic resonance images, and mammograms), 1,196 images (0.97%) generated alerts; 217 (18.1%) of these were unacknowledged. Alerts had a higher risk of being unacknowledged when the ordering HCPs were trainees (odds ratio [OR], 5.58; 95% confidence interval [CI], 2.86-10.89) and when dual-alert (>1 HCP alerted) as opposed to single-alert communication was used (OR, 2.02; 95% CI, 1.22-3.36). Timely follow-up was lacking in 92 (7.7% of all alerts) and was similar for acknowledged and unacknowledged alerts (7.3% vs 9.7%; P = .22). Risk for lack of timely follow-up was higher with dual-alert communication (OR, 1.99; 95% CI, 1.06-3.48) but lower when additional verbal communication was used by the radiologist (OR, 0.12; 95% CI, 0.04-0.38).
Nearly all abnormal results lacking timely follow-up at 4 weeks were eventually found to have measurable clinical impact in terms of further diagnostic testing or treatment. CONCLUSIONS: Critical imaging results may not receive timely follow-up actions even when HCPs receive and read results in an advanced, integrated electronic medical record system. A multidisciplinary approach is needed to improve patient safety in this area.
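The acknowledgment rule described in the Methods (an alert counts as read only if the HCP opened it within 2 weeks of transmission) can be sketched as a simple classification over alert records; the record structure and field names below are hypothetical, not the actual tracking software's:

```python
from datetime import datetime, timedelta

ACK_WINDOW = timedelta(weeks=2)

def classify_alerts(alerts):
    """Split alerts into acknowledged (opened within 2 weeks of transmission)
    and unacknowledged, mirroring the tracking rule described in the study.
    Each alert is a dict with 'transmitted' and 'opened' timestamps
    ('opened' is None if the message was never viewed)."""
    acknowledged, unacknowledged = [], []
    for alert in alerts:
        opened = alert["opened"]
        if opened is not None and opened - alert["transmitted"] <= ACK_WINDOW:
            acknowledged.append(alert)
        else:
            unacknowledged.append(alert)
    return acknowledged, unacknowledged

# Hypothetical example: one timely view, one never viewed, one viewed too late
alerts = [
    {"id": 1, "transmitted": datetime(2008, 1, 1), "opened": datetime(2008, 1, 5)},
    {"id": 2, "transmitted": datetime(2008, 1, 1), "opened": None},
    {"id": 3, "transmitted": datetime(2008, 1, 1), "opened": datetime(2008, 2, 1)},
]
acked, unacked = classify_alerts(alerts)
```

Note that, as the study found, acknowledgment alone does not guarantee timely follow-up action; a production system would track the downstream order or consultation as a separate outcome.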
Abstract:
Many persons in the U.S. gain weight during young adulthood, and the prevalence of obesity has been increasing among young adults. Although obesity and physical inactivity are generally recognized as risk factors for coronary heart disease (CHD), the magnitude of their effect on risk may have been seriously underestimated due to failure to adequately account for cigarette smoking. Because cigarette smoking causes weight loss, physically inactive smokers may remain relatively lean simply because they smoke. We hypothesize that cigarette smoking modifies the association between weight gain during young adulthood and risk of coronary heart disease during middle age, and that the true effect of weight gain during young adulthood on risk of CHD can be assessed only in persons who have not smoked cigarettes. Specifically, we hypothesize that weight gain during young adulthood is positively associated with risk of CHD during middle age in nonsmokers but that the association is much smaller or absent entirely among cigarette smokers. The purpose of this study was to test this hypothesis. The population for analysis comprised 1,934 middle-aged, employed men whose average age at the baseline examination was 48.7 years. Information collected at the baseline examinations in 1958 and 1959 included recalled weight at age 20, present weight, height, smoking status, and other CHD risk factors. To decrease the effect of intraindividual variation, the mean values of the 1958 and 1959 baseline examinations were used in analyses.
Change in body mass index (ΔBMI) during young adulthood was the primary exposure variable and was measured as BMI at baseline (kg/m²) minus BMI at age 20 (kg/m²). Proportional hazards regression analysis was used to generate relative risks of CHD mortality by category of ΔBMI and cigarette smoking status after adjustment for age, family history of CVD, major organ system disease, BMI at age 20, and number of cigarettes smoked per day. Adjustment was not performed for systolic blood pressure or total serum cholesterol, as these were regarded as intervening variables. Vital status was known for all men on the 25th anniversary of their baseline examinations. 705 deaths (including 319 CHD deaths) occurred over 40,136 person-years of experience. ΔBMI was positively associated with risk of CHD mortality in never-smokers, but not in ever-smokers (p for interaction = 0.067). For never-smokers with ΔBMI of stable, low gain, moderate gain, and high gain, adjusted relative risks were 1.00, 1.62, 1.61, and 2.78, respectively (p for trend = 0.010). For ever-smokers with ΔBMI of stable, low gain, moderate gain, and high gain, adjusted relative risks were 1.00, 0.74, 1.07, and 1.06, respectively (p for trend = 0.422). These results support the research hypothesis that cigarette smoking modifies the association between weight gain and CHD mortality. Current estimates of the magnitude of effect of obesity and physical inactivity on risk of coronary mortality may have been seriously underestimated due to inadequate handling of cigarette smoking.
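The exposure variable above is a simple arithmetic construction: baseline BMI minus BMI at age 20, then binned into gain categories. A minimal sketch; the cut points are hypothetical placeholders, since the abstract does not report the actual category boundaries used in the dissertation:

```python
def bmi(weight_kg, height_m):
    """Body mass index in kg/m^2."""
    return weight_kg / height_m ** 2

def delta_bmi(weight_at_baseline, weight_at_20, height_m):
    """Change in BMI during young adulthood: baseline BMI minus BMI at age 20
    (height is assumed unchanged between the two measurements)."""
    return bmi(weight_at_baseline, height_m) - bmi(weight_at_20, height_m)

def gain_category(d):
    """Hypothetical cut points for the stable/low/moderate/high gain bins."""
    if d < 1.0:
        return "stable"
    elif d < 3.0:
        return "low gain"
    elif d < 5.0:
        return "moderate gain"
    return "high gain"

# Example: a man who weighed 72 kg at age 20 and 85 kg at baseline
d = delta_bmi(weight_at_baseline=85.0, weight_at_20=72.0, height_m=1.78)
```

The relative risks themselves came from proportional hazards regression, which in current practice would be fit with a survival-analysis package rather than by hand.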
Abstract:
Purpose: To assess liver remnant volume regeneration and maintenance, and complications, in the long-term follow-up of donors after living donor liver transplantation (LDLT) using CT and MRI. Materials and Methods: 47 donors with a mean age of 33.5 years who donated liver tissue for transplantation and who were available for follow-up imaging were included in this retrospective study. Contrast-enhanced CT and MR studies were acquired for routine follow-up. Two observers evaluated pre- and postoperative images regarding anatomy and pathological findings. Volumes were measured manually on contrast-enhanced images in the portal venous phase, and potential postoperative complications were documented. Pre- and postoperative liver volumes were compared to evaluate liver remnant regeneration. Results: 47 preoperative and 89 follow-up studies covered a period of 22.4 months (range: 1 - 84). After right liver lobe (RLL) donation, the mean liver remnant volume was 522.0 ml (± 144.0; 36.1 %; n = 18), after left lateral section (LLS) donation 1,121.7 ml (± 212.8; 79.9 %; n = 24), and after left liver lobe (LLL) donation 1,181.5 ml (± 279.5; 72.0 %; n = 5). Twelve months after donation, the liver remnant volumes were 87.3 % (RLL; ± 11.8; n = 11), 95.0 % (LLS; ± 11.6; n = 18), and 80.1 % (LLL; ± 2.0; n = 2) of the preoperative total liver volume. Rapid initial regeneration and maintenance at 80 % of the preoperative liver volume were observed over the total follow-up period. Minor postoperative complications were found early in 4 patients. No severe or late complications or mortality occurred. Conclusion: Rapid regeneration of liver remnant volumes in all donors and volume maintenance over the long-term follow-up period of up to 84 months without severe or late complications are important observations for assessing the safety of LDLT donors.
Key Points: Liver remnant volumes of LDLT donors rapidly regenerated after donation and volumes were maintained over the long-term follow-up period of up to 84 months without severe or late complications.
Abstract:
BACKGROUND: Little is known about the very long-term incidence of major adverse cardiac events (MACE), target-lesion revascularization (TLR), target-vessel revascularization and stent thrombosis after sirolimus-eluting stent (SES) implantation. We present the first study to provide a 10-year clinical follow-up in an unselected patient population who underwent SES implantation. METHODS AND RESULTS: We performed a systematic 10-year clinical follow-up in a series of 200 consecutive patients treated with unrestricted SES implantation between April 2002 and April 2003 in two Swiss hospitals. Outcomes and follow-up were obtained in all 200 patients. The cumulative 10-year MACE rate was 47%, with all-cause death of 20%, cardiac death of 9%, myocardial infarction of 7%, and TLR and target-vessel revascularization of 8% and 11%, respectively. The Academic Research Consortium-defined "definite and probable" stent thrombosis rate was 2.5%. TLR risk was maximal between 3 and 6 years. New-lesion revascularization increased throughout the study period. CONCLUSION: The incidence of TLR was maximal 3 to 6 years after SES implantation and decreased thereafter. MACE and non-TLR revascularization rates steadily increased during the complete follow-up, underscoring the progression of coronary artery disease.
Abstract:
OBJECTIVE The ACCESS treatment model offers assertive community treatment embedded in an integrated care program to patients with psychoses. Compared to standard care within a controlled study, it proved more effective in terms of service disengagement and illness outcomes in patients with schizophrenia spectrum disorders over 12 months. ACCESS was then implemented in clinical routine, and its effectiveness was assessed over 24 months in severe schizophrenia spectrum disorders and bipolar I disorder with psychotic features (DSM-IV) in a cohort study. METHOD All 115 patients treated in ACCESS (from May 2007 to October 2009) were included in the ACCESS II study. The primary outcome was the rate of service disengagement. Secondary outcomes were change in psychopathology, severity of illness, psychosocial functioning, quality of life, satisfaction with care, medication nonadherence, length of hospital stay, and rates of involuntary hospitalization. RESULTS Only 4 patients (3.4%) disengaged from the service. Another 11 (9.6%) left because they moved outside the catchment area. Patients received a mean of 1.6 outpatient contacts per week. Involuntary admissions decreased from 34.8% in the 2 previous years to 7.8% during ACCESS (P < .001). Mixed-models repeated-measures analyses revealed significant improvements among all patients in psychopathology (effect size d = 0.64, P < .001), illness severity (d = 0.84, P = .03), functioning level (d = 0.65, P < .001), quality of life (d = 0.50, P < .001), and client satisfaction (d = 0.11, P < .001). At 24 months, 78.3% were fully adherent to medication, compared to 25.2% at baseline (P = .002). CONCLUSIONS ACCESS was successfully implemented in clinical routine and maintained excellent rates of service engagement and other outcomes in patients with schizophrenia spectrum disorders or bipolar I disorder with psychotic features over 24 months. TRIAL REGISTRATION ClinicalTrials.gov identifier: NCT01888627.
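The effect sizes reported above are Cohen's d values. For a pre/post comparison, one common convention divides the mean change by the pooled standard deviation of the two time points; a minimal sketch with hypothetical data (the study itself used mixed-models repeated-measures analyses, which handle missing data and within-patient correlation more rigorously):

```python
from math import sqrt
from statistics import mean, stdev

def cohens_d(pre, post):
    """Cohen's d for a pre/post comparison: mean change divided by the
    pooled standard deviation of the two time points."""
    pooled_sd = sqrt((stdev(pre) ** 2 + stdev(post) ** 2) / 2)
    return (mean(post) - mean(pre)) / pooled_sd

# Hypothetical functioning scores (higher = better) at baseline and 24 months
pre = [40, 45, 50, 55, 60]
post = [50, 55, 58, 65, 72]
d = cohens_d(pre, post)
```

By the usual rough benchmarks, d around 0.5 is a medium effect and around 0.8 a large one, which is how the improvements in psychopathology and illness severity above would typically be read.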