241 results for follow up
Abstract:
Loss to follow-up (LTFU) is a common problem in many epidemiological studies. In antiretroviral treatment (ART) programs for patients with human immunodeficiency virus (HIV), mortality estimates can be biased if the LTFU mechanism is non-ignorable, that is, mortality differs between lost and retained patients. In this setting, routine procedures for handling missing data may lead to biased estimates. To appropriately deal with non-ignorable LTFU, explicit modeling of the missing data mechanism is needed. This can be based on additional outcome ascertainment for a sample of patients LTFU, for example, through linkage to national registries or through survey-based methods. In this paper, we demonstrate how this additional information can be used to construct estimators based on inverse probability weights (IPW) or multiple imputation. We use simulations to contrast the performance of the proposed estimators with methods widely used in HIV cohort research for dealing with missing data. The practical implications of our approach are illustrated using South African ART data, which are partially linkable to South African national vital registration data. Our results demonstrate that while IPWs and proper imputation procedures can be easily constructed from additional outcome ascertainment to obtain valid overall estimates, neglecting non-ignorable LTFU can result in substantial bias. We believe the proposed estimators are readily applicable to a growing number of studies where LTFU is appreciable, but additional outcome data are available through linkage or surveys of patients LTFU.
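As an illustration of the weighting idea described above, the sketch below (simulated cohort, hypothetical column names and tracing fraction; not the authors' code) builds inverse probability weights from a subsample of patients LTFU whose vital status is ascertained, e.g. through registry linkage:

```python
import numpy as np
import pandas as pd

# Simulated cohort with non-ignorable LTFU: mortality is higher among lost patients.
rng = np.random.default_rng(0)
n = 5000
ltfu = rng.random(n) < 0.3                                   # 30% lost to follow-up
died_true = np.where(ltfu, rng.random(n) < 0.20, rng.random(n) < 0.05)
traced = ltfu & (rng.random(n) < 0.4)                        # 40% of lost patients traced
known = ~ltfu | traced                                       # vital status observed
df = pd.DataFrame({"ltfu": ltfu, "traced": traced,
                   "died": np.where(known, died_true, np.nan)})

# Inverse probability weights: retained patients get weight 1; each traced LTFU
# patient stands in for all LTFU patients, so gets weight 1 / P(traced | LTFU).
p_trace = df.loc[df.ltfu, "traced"].mean()
df["w"] = np.where(~df.ltfu, 1.0, np.where(df.traced, 1.0 / p_trace, 0.0))

naive = df.loc[~df.ltfu, "died"].mean()                       # complete-case estimate
ipw = (df["w"] * df["died"].fillna(0)).sum() / df["w"].sum()  # IPW estimate
print(f"naive mortality {naive:.3f}, IPW mortality {ipw:.3f}, truth {died_true.mean():.3f}")
```

The naive complete-case estimate understates mortality because the lost patients, who die more often, are ignored; the weighted estimate recovers the overall rate.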
Abstract:
OBJECTIVES: Treatment as prevention depends on retaining HIV-infected patients in care. We investigated the effect on HIV transmission of bringing patients lost to follow-up (LTFU) back into care. DESIGN: Mathematical model. METHODS: Stochastic mathematical model of cohorts of 1000 HIV-infected patients on antiretroviral therapy (ART), based on data from two clinics in Lilongwe, Malawi. We calculated cohort viral load (CVL; sum of individual mean viral loads each year) and used a mathematical relationship between viral load and transmission probability to estimate the number of new HIV infections. We simulated four scenarios: 'no LTFU' (all patients stay in care); 'no tracing' (patients LTFU are not traced); 'immediate tracing' (after a missed clinic appointment); and 'delayed tracing' (after six months). RESULTS: About 440 of 1000 patients were LTFU over five years. CVL (million copies/ml per 1000 patients) was 3.7 (95% prediction interval [PrI] 2.9-4.9) for no LTFU, 8.6 (95% PrI 7.3-10.0) for no tracing, 7.7 (95% PrI 6.2-9.1) for immediate tracing, and 8.0 (95% PrI 6.7-9.5) for delayed tracing. Comparing no LTFU with no tracing, the number of new infections increased from 33 (95% PrI 29-38) to 54 (95% PrI 47-60) per 1000 patients. Immediate tracing prevented 3.6 (95% PrI -3.3 to 12.8) and delayed tracing 2.5 (95% PrI -5.8 to 11.1) new infections per 1000 patients. Immediate tracing was more efficient than delayed tracing: 116 and 142 tracing efforts, respectively, were needed to prevent one new infection. CONCLUSION: Tracing of patients LTFU enhances the preventive effect of ART, but the number of transmissions prevented is small.
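The cohort-viral-load calculation lends itself to a toy illustration. The sketch below uses invented viral-load distributions and a hypothetical dose-response curve; it does not reproduce the published model's parameters:

```python
import numpy as np

rng = np.random.default_rng(1)
n_patients = 1000

# Invented mean viral loads (copies/ml): most patients suppressed on ART, a
# minority (e.g. those lost to follow-up) with high-level viraemia.
suppressed = rng.random(n_patients) < 0.8
viral_load = np.where(suppressed,
                      rng.lognormal(np.log(50), 0.5, n_patients),
                      rng.lognormal(np.log(50_000), 1.0, n_patients))

cvl = viral_load.sum()   # cohort viral load: sum of individual mean viral loads

def yearly_transmission_prob(vl, base=0.01, ratio_per_log10=2.45):
    """Hypothetical dose-response: base yearly risk at 1,000 copies/ml, multiplied
    by ratio_per_log10 for every log10 increase in viral load, capped at 1."""
    return np.minimum(1.0, base * ratio_per_log10 ** (np.log10(vl) - 3.0))

new_infections = yearly_transmission_prob(viral_load).sum()
print(f"CVL = {cvl / 1e6:.1f} million copies/ml per {n_patients} patients; "
      f"expected new infections per year ~ {new_infections:.0f}")
```

Tracing scenarios differ only in how many patients remain in the high-viraemia group, which is why CVL and expected infections move together.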
Abstract:
BACKGROUND Survival rates in implant dentistry today are high, although late failures do occur for many reasons, including peri-implant infections. The primary objective of this study is to investigate the microbiota around single turned implants after 16 to 22 years. Secondary objectives are to compare teeth and implants and to correlate microbiologic, radiographic, and clinical parameters. METHODS A total of 46 patients with single implants were invited for a clinical examination. Clinical data were collected from implants and contralateral natural teeth. Radiographic bone level was measured around implants. Microbiologic samples were taken from implants, contralateral teeth, and the deepest pocket per quadrant. Samples were analyzed with DNA-DNA hybridization including 40 species. Statistical analysis was performed using Wilcoxon signed-rank tests, McNemar tests, and Spearman correlation coefficients with a 0.05 significance level. RESULTS Mean follow-up was 18.5 years (range 16 to 22 years). Tannerella forsythia (1.5 × 10⁵) and Veillonella parvula (1.02 × 10⁵) showed the highest concentrations around implants and teeth, respectively. Porphyromonas gingivalis, Prevotella intermedia, and T. forsythia were detected significantly more often around implants than around teeth. Mean counts were significantly higher around implants than teeth for Parvimonas micra, P. gingivalis, P. intermedia, T. forsythia, and Treponema denticola. Total DNA count was correlated with the interproximal bleeding index (r = 0.409) and interproximal probing depth (r = 0.307). No correlations were present with plaque index or radiographic bone level. CONCLUSIONS In the present study, bacterial counts around single implants in periodontally healthy patients are rather low. Although pathogenic bacteria are present, some in higher numbers around implants than teeth (5 of 40 species), the majority of implants present with healthy peri-implant tissues without progressive bone loss.
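For readers unfamiliar with the paired, non-parametric analysis described in the METHODS, the sketch below (entirely simulated data) shows how the Wilcoxon signed-rank test and Spearman correlations would typically be run with scipy:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
n = 46  # one implant and one contralateral tooth per patient

# Simulated bacterial counts and clinical parameters (arbitrary units).
counts_implant = rng.lognormal(11.0, 1.2, n)
counts_tooth = rng.lognormal(10.5, 1.2, n)
bleeding_index = rng.random(n)
probing_depth = rng.normal(3.5, 1.0, n)
total_dna = counts_implant + counts_tooth

# Paired within-patient comparison of implant vs tooth counts.
w_stat, w_p = stats.wilcoxon(counts_implant, counts_tooth)

# Rank correlation of total DNA count with clinical parameters.
rho_bleed, p_bleed = stats.spearmanr(total_dna, bleeding_index)
rho_depth, p_depth = stats.spearmanr(total_dna, probing_depth)
print(f"Wilcoxon p = {w_p:.3f}; Spearman r = {rho_bleed:.2f} (bleeding), "
      f"{rho_depth:.2f} (probing depth)")
```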
Abstract:
BACKGROUND Of the approximately 2.4 million American women with a history of breast cancer, 43% are aged ≥ 65 years and are at risk for developing subsequent malignancies. METHODS Women from 6 geographically diverse sites included 5-year breast cancer survivors (N = 1361) who were diagnosed between 1990 and 1994 at age ≥ 65 years with stage I or II disease and a comparison group of women without breast cancer (N = 1361). Women in the comparison group were age-matched and site-matched to breast cancer survivors on the date of breast cancer diagnosis. Follow-up began 5 years after the index date (survivor diagnosis date or comparison enrollment date) and continued until death, disenrollment, or 15 years after the index date. Data were collected from medical records and electronic sources (cancer registry, administrative, clinical, National Death Index). Analyses included descriptive statistics, crude incidence rates, and Cox proportional hazards regression models for estimating the risk of incident malignancy, adjusted for death as a competing risk. RESULTS Survivors and women in the comparison group were similar: >82% were white, 55% had a Charlson Comorbidity Index of 0, and ≥ 73% had a body mass index ≤ 30 kg/m². Among the 306 women (N = 160 in the survivor group, N = 146 in the comparison group) who developed a first incident malignancy during follow-up, the mean time to malignancy was similar (4.37 ± 2.81 years vs 4.03 ± 2.76 years, respectively; P = .28), whereas unadjusted incidence rates were slightly higher in survivors (1882 vs 1620 per 100,000 person-years). The adjusted hazard of developing a first incident malignancy was slightly elevated in survivors relative to women in the comparison group, but it was not statistically significant (hazard ratio, 1.17; 95% confidence interval, 0.94-1.47). CONCLUSIONS Older women who survived 5 years after an early-stage breast cancer diagnosis were not at an elevated risk for developing subsequent incident malignancies up to 15 years after their breast cancer diagnosis.
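The adjusted hazard ratio quoted above comes from a Cox model accounting for death as a competing risk. Below is a minimal sketch of one common approach, a cause-specific Cox model that censors deaths without malignancy, using the lifelines package and simulated data; the study's exact adjustment may differ:

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(3)
n = 2000

# Simulated data: follow-up time (years, capped at 15), event type
# (0 = censored, 1 = incident malignancy, 2 = death without malignancy),
# and a survivor-vs-comparison group indicator.
df = pd.DataFrame({
    "time": rng.exponential(scale=10, size=n).clip(max=15),
    "event": rng.choice([0, 1, 2], size=n, p=[0.70, 0.12, 0.18]),
    "survivor": rng.integers(0, 2, n),
})

# Cause-specific hazard of malignancy: deaths without malignancy are censored
# at the time of death.
df["malignancy"] = (df["event"] == 1).astype(int)
cph = CoxPHFitter()
cph.fit(df[["time", "malignancy", "survivor"]], duration_col="time", event_col="malignancy")
print(cph.hazard_ratios_)  # HR for survivors vs the comparison group
```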
Abstract:
OBJECTIVE To investigate whether it is valid to combine follow-up and change data when conducting meta-analyses of continuous outcomes. STUDY DESIGN AND SETTING Meta-epidemiological study of randomized controlled trials in patients with osteoarthritis of the knee or hip that assessed patient-reported pain. We calculated standardized mean differences (SMDs) based on follow-up and change data and pooled within-trial differences in SMDs. We also derived pooled SMDs from the estimate indicating the largest treatment effect within a trial (optimistic selection of SMDs) and from the estimate indicating the smallest treatment effect within a trial (pessimistic selection of SMDs). RESULTS A total of 21 meta-analyses with 189 trials with 292 randomized comparisons in 41,256 patients were included. On average, SMDs were 0.04 standard deviation units more beneficial when follow-up values were used (difference in SMDs: -0.04; 95% confidence interval: -0.13, 0.06; P=0.44). In 13 meta-analyses (62%), there was a relevant difference in clinical and/or statistical significance between optimistic and pessimistic pooled SMDs. CONCLUSION On average, there is no relevant difference between follow-up and change-data SMDs, and combining these estimates in meta-analysis is generally valid. The decision on which type of data to use when both follow-up and change data are available should be prespecified in the meta-analysis protocol.
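To make concrete how follow-up and change data yield different SMDs, the sketch below computes both for one simulated pain trial (all numbers invented):

```python
import numpy as np

def smd(treat, ctrl):
    """Standardized mean difference: difference in means over the pooled SD
    (Hedges' small-sample correction omitted for brevity)."""
    n1, n2 = len(treat), len(ctrl)
    pooled_sd = np.sqrt(((n1 - 1) * np.var(treat, ddof=1) +
                         (n2 - 1) * np.var(ctrl, ddof=1)) / (n1 + n2 - 2))
    return (np.mean(treat) - np.mean(ctrl)) / pooled_sd

rng = np.random.default_rng(4)
n = 150
baseline_t, baseline_c = rng.normal(60, 15, n), rng.normal(60, 15, n)  # pain scores (0-100)
followup_t = baseline_t - rng.normal(20, 12, n)   # treated arm improves more
followup_c = baseline_c - rng.normal(12, 12, n)

smd_followup = smd(followup_t, followup_c)                           # follow-up values
smd_change = smd(followup_t - baseline_t, followup_c - baseline_c)   # change from baseline
print(f"SMD from follow-up data: {smd_followup:.2f}; from change data: {smd_change:.2f}")
# Lower pain is better, so more negative SMDs favour treatment; the two estimates
# differ because change scores have a different SD than follow-up scores.
```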
Abstract:
BACKGROUND An increasing number of childhood cancer survivors need long-term follow-up care. Different models address this problem, including follow-up by general practitioners (GPs). We describe models that involve GPs in follow-up for childhood cancer survivors, their advantages and disadvantages, clinics that employ these models, and the elements essential to high-quality, GP-led follow-up care. PROCEDURE We searched four databases (PubMed [including Medline], Embase, Cochrane, and CINAHL) without language restrictions. RESULTS We found 26 publications that explicitly mentioned GP-led follow-up. Two models were commonly described: GP-only, and shared care between the GP and a pediatric oncology or late effects clinic. The shared care model appears to have advantages over GP-only follow-up. We found four clinics using models of GP-led follow-up, described in five papers. We identified a well-organized transition, a treatment summary, a survivorship care plan, education of GPs, and guidelines as necessary components of successful follow-up. CONCLUSION The scarcity of literature necessitated a review rather than a meta-analysis. More research on the outcomes of GP-led care is necessary to confirm the model for long-term follow-up of childhood cancer survivors. However, with the necessary elements in place, the model of GP-led follow-up, and shared care in particular, holds promise.
Abstract:
AIMS To assess serially the edge vascular response (EVR) of a bioresorbable vascular scaffold (BVS) compared with a metallic everolimus-eluting stent (EES). METHODS AND RESULTS Non-serial evaluations of the Absorb BVS at one year have previously demonstrated proximal edge constrictive remodelling and distal edge changes in plaque composition with an increase in the percent fibro-fatty (FF) tissue component. The 5 mm proximal and distal segments adjacent to the implanted devices were investigated serially with intravascular ultrasound (IVUS), post procedure, at six months and at two years, in the ABSORB Cohort B1 (n=45) and the SPIRIT II (n=113) trials. Twenty-two proximal and twenty-four distal edge segments were available for analysis in the ABSORB Cohort B1 trial. In the SPIRIT II trial, thirty-three proximal and forty-six distal edge segments were analysed. At the 5-mm proximal edge, the vessels treated with an Absorb BVS from post procedure to two years demonstrated a lumen loss (LL) of 6.68% (-17.33; 2.08) (p=0.027) with a trend toward plaque area increase of 7.55% (-4.68; 27.11) (p=0.06). At the 5-mm distal edge, no major changes were evident at either time point. At the 5-mm proximal edge, the vessels treated with a XIENCE V EES from post procedure to two years did not show any signs of LL, only a plaque area decrease of 6.90% (-17.86; 4.23) (p=0.035). At the distal edge, no major changes were evident with regard to either lumen area or vessel remodelling at the same time point. CONCLUSIONS The IVUS-based serial evaluation of the EVR up to two years following implantation of a bioresorbable everolimus-eluting scaffold shows a statistically significant proximal edge LL; however, this finding did not appear to have any clinical implications in the serial assessment. The upcoming imaging follow-up of the Absorb BVS at three years is anticipated to provide further information regarding vessel wall behaviour at the edges.
Abstract:
UNLABELLED The automatic implantable defibrillator (AID) is the treatment of choice for primary and secondary prevention of sudden death. At the Instituto Nacional de Cardiología, from October 1996 to January 2002, 25 patients were implanted with 26 AIDs. There were 23 men (92%), and the mean age of the whole group was 51.4 years. Twenty-three patients (92%) presented with structural heart disease, the most common being ischemic heart disease in 13 patients (52%), with a mean ejection fraction of 37.8%. One patient without structural heart disease had Brugada syndrome. The most frequent clinical arrhythmia was ventricular tachycardia, in 14 patients (56%). The mean follow-up was 29.3 months, during which a total of 30 episodes of ventricular arrhythmia were treated by the AID; six of them were inappropriate therapies due to paroxysmal atrial fibrillation. Ten patients (34%) have not received any AID therapy. Three patients (12%) died of congestive heart failure refractory to pharmacologic treatment. CONCLUSION Implantation of the AID is a safe and effective measure for primary and secondary prevention of sudden death. Worldwide experience shows that this kind of device has not modified the mortality rate due to heart failure in these patients, but it has reduced sudden arrhythmic death.
Abstract:
BACKGROUND The objective of this study was to compare transtelephonic ECG every 2 days and serial 7-day Holter monitoring as two methods of follow-up after atrial fibrillation (AF) catheter ablation for judging ablation success. Patients with highly symptomatic AF are increasingly treated with catheter ablation. Several methods of follow-up have been described, and judgment of ablation success often relies on patients' symptoms. However, the optimal follow-up strategy for objectively detecting most AF recurrences is still unclear. METHODS Thirty patients with highly symptomatic AF were selected for circumferential pulmonary vein ablation. During follow-up, a transtelephonic ECG was transmitted once every 2 days for half a year. Additionally, a 7-day Holter was recorded before ablation, after ablation, and after 3 and 6 months, respectively. With both procedures, symptoms and actual rhythm were correlated thoroughly. RESULTS A total of 2,600 transtelephonic ECGs were collected, 216 of which showed AF; 25% of those episodes were asymptomatic. On Kaplan-Meier analysis, 45% of the patients with paroxysmal AF were still in continuous sinus rhythm (SR) after 6 months. Simulating a follow-up based on symptomatic recurrences only, that number would have increased to 70%. Using serial 7-day ECG, 113 Holter recordings with over 18,900 hours of ECG recording were acquired. After 6 months, the percentage of patients classified as free from AF was 50%. Of the patients with recurrences, 30-40% were completely asymptomatic. The percentage of asymptomatic AF episodes increased stepwise from 11% before ablation to 53% at 6 months after ablation. CONCLUSIONS The success rate in terms of freedom from AF was 70% on a symptom-only-based follow-up; it decreased to 50% with serial 7-day Holter and to 45% with transtelephonic monitoring. Transtelephonic ECG and serial 7-day Holter were equally effective for objectively determining long-term success and detecting asymptomatic patients.
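The Kaplan-Meier figures quoted above (45-70% freedom from AF depending on the monitoring method) can be reproduced in principle as sketched below; this uses the lifelines package and simulated recurrence times, not the study data:

```python
import numpy as np
from lifelines import KaplanMeierFitter

rng = np.random.default_rng(5)
n = 30

# Simulated times (days) to first documented AF recurrence; patients without a
# detected recurrence are censored at 180 days of follow-up.
recurrence_time = rng.exponential(scale=250, size=n)
detected = recurrence_time <= 180          # recurrence detected within 6 months
time = np.minimum(recurrence_time, 180)

kmf = KaplanMeierFitter()
kmf.fit(time, event_observed=detected, label="freedom from AF")
print(kmf.survival_function_.tail(1))      # estimated proportion still AF-free at 6 months
# A more sensitive follow-up method (e.g. transtelephonic ECG every 2 days) flags
# more recurrences as detected, which pushes the curve, and the success rate, lower.
```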
Abstract:
BACKGROUND Trials assessing the benefit of immediate androgen-deprivation therapy (ADT) for treating prostate cancer (PCa) have often done so based on differences in detectable prostate-specific antigen (PSA) relapse or metastatic disease rates at a specific time after randomization. OBJECTIVE Based on the long-term results of European Organization for Research and Treatment of Cancer (EORTC) trial 30891, we questioned whether differences in time to progression predict differences in survival. DESIGN, SETTING, AND PARTICIPANTS EORTC trial 30891 randomly assigned T0-4 N0-2 M0 PCa patients to immediate ADT (n=492), by orchiectomy or a luteinizing hormone-releasing hormone analog, or to deferred ADT (n=493) initiated upon symptomatic disease progression or life-threatening complications. OUTCOME MEASUREMENTS AND STATISTICAL ANALYSIS Time to first objective progression (documented metastases or ureteric obstruction, not PSA rise) and time to objective castration-resistant progressive disease were compared, as were PCa mortality and overall survival. RESULTS AND LIMITATIONS After a median of 12.8 yr, 769 of the 985 patients had died (78%), 269 of PCa (27%). For patients receiving deferred ADT, the overall treatment time was 31% of that for patients on immediate ADT. Deferred ADT was significantly worse than immediate ADT for time to first objective disease progression (p<0.0001; 10-yr progression rates 42% vs 30%). However, time to objective castration-resistant disease after deferred ADT did not differ significantly (p=0.42) from that after immediate ADT. In addition, PCa mortality did not differ significantly, except in patients with aggressive PCa resulting in death within 3-5 yr after diagnosis. Deferred ADT was inferior to immediate ADT in terms of overall survival (hazard ratio: 1.21; 95% confidence interval, 1.05-1.39; p [noninferiority] = 0.72, p [difference] = 0.0085). CONCLUSIONS This study shows that if hormonal manipulation is used at different times during the disease course, differences in time to first disease progression cannot predict differences in disease-specific survival. A deferred ADT policy may substantially reduce the time on treatment, but it is not suitable for patients with rapidly progressing disease.
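The reported p [noninferiority] = 0.72 reflects a test of whether the overall-survival hazard ratio for deferred ADT stays below a prespecified margin. A back-of-the-envelope check using the abstract's own estimates (the margin below is illustrative; the trial's actual margin is not stated here):

```python
# Hazard ratio and 95% CI for overall survival, deferred vs immediate ADT (from the abstract).
hr, ci_low, ci_high = 1.21, 1.05, 1.39
margin = 1.25  # illustrative noninferiority margin (assumption, not the trial's stated value)

# Noninferiority requires the entire CI to lie below the margin; here the upper
# limit exceeds it, consistent with noninferiority not being demonstrated.
print(f"HR {hr} (95% CI {ci_low}-{ci_high}); upper limit < margin {margin}? {ci_high < margin}")
```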
Abstract:
Purpose To determine whether diffusion-weighted (DW) magnetic resonance (MR) imaging in living renal allograft donation allows monitoring of potential changes in the donor's remaining, nontransplanted kidney after unilateral nephrectomy and in the transplanted kidney before and after transplantation in donor and recipient, respectively, and whether DW MR parameters are correlated in the same kidney before and after transplantation. Materials and Methods The study protocol was approved by the local ethics committee; written informed consent was obtained. Thirteen healthy kidney donors and their corresponding recipients prospectively underwent DW MR imaging (multiple b values) in donors before donation and in donors and recipients at day 8 and months 3 and 12 after donation. Total apparent diffusion coefficient (ADCT) values were determined; the contribution of microcirculation was quantified as the perfusion fraction (FP). Longitudinal changes of diffusion parameters were compared (repeated-measures one-way analysis of variance with post hoc pairwise comparisons). Correlations were tested (linear regression). Results ADCT values in the nontransplanted kidney of donors increased from a preexplantation value of (188 ± 9 [standard deviation]) × 10⁻⁵ mm²/sec to (202 ± 11) × 10⁻⁵ mm²/sec in the medulla and from (199 ± 11) to (210 ± 13) × 10⁻⁵ mm²/sec in the cortex 1 week after donation (P < .004). Medullary, but not cortical, ADCT values remained increased up to 1 year. ADCT values in allografts in recipients were stable. Compared with values obtained before transplantation in donors, the corticomedullary difference was reduced in allografts (P < .03). Cortical ADCT values correlated with estimated glomerular filtration rate in recipients (R = 0.56, P < .001) but not in donors. Cortical ADCT values in the same kidney before transplantation in donors correlated with those in recipients on day 8 after transplantation (R = 0.77, P = .006). FP did not show significant changes. Conclusion DW MR imaging depicts early adaptations in the remaining nontransplanted kidney of donors after nephrectomy. All diffusion parameters remained constant in allograft recipients after transplantation. This method has potential monitoring utility, although assessment of its clinical relevance is needed.
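For readers unfamiliar with how ADCT and FP are obtained from multi-b-value acquisitions, the sketch below fits a noise-free synthetic intravoxel-incoherent-motion (IVIM) signal (invented b values and tissue parameters; the study's actual fitting procedure may differ):

```python
import numpy as np

# Synthetic bi-exponential IVIM signal: a slow tissue-diffusion component plus a
# fast pseudo-diffusion (perfusion) component with fraction FP.
b = np.array([0, 50, 100, 200, 400, 600, 800], dtype=float)   # s/mm^2 (invented)
true_adc, true_fp, true_dstar = 2.0e-3, 0.15, 20e-3           # mm^2/s
signal = (1 - true_fp) * np.exp(-b * true_adc) + true_fp * np.exp(-b * true_dstar)

# Total ADC (ADCT): slope of a monoexponential fit over all b values.
adc_total = -np.polyfit(b, np.log(signal), 1)[0]

# Segmented IVIM estimate: fit tissue diffusion on high b values only (where the
# perfusion component has decayed), then read FP off the extrapolated b=0 intercept.
high = b >= 200
slope, intercept = np.polyfit(b[high], np.log(signal[high]), 1)
fp_est = 1.0 - np.exp(intercept)

print(f"ADCT ~ {adc_total * 1e5:.0f} x 10^-5 mm^2/sec, FP ~ {fp_est:.2f}")
```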
Abstract:
This study aimed to examine the aetiology of acute diarrhoea and the relapse rate in 100 client-owned dogs presented to a first-opinion clinic. History, physical examination, faecal testing and owner questionnaire data were collected at initial presentation (T0) and at either the time of relapse or at a recheck performed within 3 months. All dogs received treatment according to their clinical signs. Of 96 dogs that completed the study, 37 (38.5%) relapsed during the study period, 21 (21.9%) relapsed within 3 months, and 16 others (16.6%) at 3 months to 1 year after initial examination. Dogs that had undergone a change in housing location within 1 month prior to presentation and dogs <1 year old were significantly more likely to have positive parasitological analyses (P=0.02 and P=0.001, respectively). Pica was a risk factor for relapse (P=0.0002).
Abstract:
The ÆQUAS study (a German acronym for “Work Experiences and Quality of Life in Switzerland”) followed young workers in five occupations over their first ten years in the labor market. Participants reported on working conditions and well-being on five occasions. Overall, resources at work as well as well-being, health, and personal resources remained stable or increased. At the same time, task-related stressors also increased. This result may reflect career progress (e.g., gaining more responsibilities may be accompanied by increasing time pressure), but the development of task-related stressors and resources may also be specific to particular occupations. Several trajectories had their turning point after the first or second year in the labor market, which may reflect successful professional socialization. Even though a substantial number of participants changed their occupation over these ten years (with benefits for their well-being), the overall development over the first ten years after vocational training suggests a successful transition into the labor market.
Abstract:
Recovery from acute episodes of thrombotic thrombocytopenic purpura (TTP) appears complete except for minor cognitive abnormalities and a risk for relapse. The Oklahoma TTP-HUS (hemolytic uremic syndrome) Registry enrolled 70 consecutive patients from 1995 to 2011 with ADAMTS13 activity <10% at their initial episode; 57 survived, with follow-up through 2012. Body mass index (BMI), glomerular filtration rate (GFR), urine albumin/creatinine ratio (ACR), and the prevalence of hypertension, major depression, and systemic lupus erythematosus (SLE), as well as the risk of death, were compared with expected values based on the US reference population. At initial diagnosis, the 57 survivors had a median age of 39 years; 45 (79%) were women; 21 (37%) were black; BMI and the prevalence of SLE (7%) were greater (P < .001) than expected; the prevalence of hypertension (19%; P = .463) was not different. GFR (P = .397) and ACR (P = .793) were not different from expected values. In 2011-2012, the prevalence of hypertension (40% vs 23%; P = .013) and major depression (19% vs 6%; P = .005) was greater than expected. Eleven patients (19%) have died, a proportion greater than expected compared with US and Oklahoma reference populations (P < .05). TTP survivors may have a greater risk for poor health and premature death.
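As a rough illustration of comparing an observed proportion in the survivor cohort with an expected population value, the sketch below uses the abstract's 40% observed vs 23% expected hypertension prevalence; the registry's actual comparison was likely standardized or adjusted, so this is only a simplified binomial check:

```python
from scipy.stats import binomtest

n_survivors = 57
observed_hypertension = round(0.40 * n_survivors)   # ~23 patients, from the reported 40%
expected_prevalence = 0.23                           # reported expected value

result = binomtest(observed_hypertension, n_survivors, expected_prevalence)
print(f"observed {observed_hypertension}/{n_survivors} "
      f"({observed_hypertension / n_survivors:.0%}) vs expected {expected_prevalence:.0%}; "
      f"p = {result.pvalue:.3f}")
```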