42 results for TREATMENT OUTCOME
Abstract:
BACKGROUND: Several observational studies have evaluated the effect of a single exposure window with blood pressure (BP) medications on outcomes in incident dialysis patients, but it is unclear whether BP medication prescription patterns remain stable over time or whether a single-exposure-window design is adequate for evaluating effects on outcomes. METHODS: We described patterns of BP medication prescription over the 6 months after dialysis initiation in hemodialysis and peritoneal dialysis patients, stratified by cardiovascular comorbidity, diabetes, and other patient characteristics. The cohort included 13,072 adult patients (12,159 hemodialysis, 913 peritoneal dialysis) who initiated dialysis in Dialysis Clinic, Inc. facilities from January 1, 2003, through June 30, 2008, and remained on the original modality for at least 6 months. We evaluated monthly patterns in BP medication prescription over 6 months and at 12 and 24 months after initiation. RESULTS: Prescription patterns varied by dialysis modality over the first 6 months; substantial proportions of patients with prescriptions for beta-blockers, renin-angiotensin system agents, and dihydropyridine calcium channel blockers in month 6 no longer had prescriptions for these medications by month 24. Prescription of specific medication classes varied by comorbidity, race/ethnicity, and age, but varied little by sex. The mean number of medications was 2.5 at month 6 in both the hemodialysis and peritoneal dialysis cohorts. CONCLUSIONS: This study evaluates BP medication patterns in both hemodialysis and peritoneal dialysis patients over the first 6 months of dialysis. Our findings highlight the challenges of assessing the comparative effectiveness of a single BP medication class in dialysis patients. Longitudinal designs should be used to account for changes in BP medication management over time, and designs that incorporate common medication combinations should be considered.
Abstract:
BACKGROUND: Evidence is lacking to inform providers' and patients' decisions about many common treatment strategies for patients with end-stage renal disease (ESRD). METHODS/DESIGN: The DEcIDE Patient Outcomes in ESRD Study is funded by the United States (US) Agency for Healthcare Research and Quality to study the comparative effectiveness of (1) antihypertensive therapies, (2) early versus later initiation of dialysis, and (3) intravenous iron therapies on clinical outcomes in patients with ESRD. Ongoing studies utilize four existing, nationally representative cohorts of patients with ESRD: (1) the Choices for Healthy Outcomes in Caring for ESRD study (1041 incident dialysis patients recruited from October 1995 to June 1999, with complete outcome ascertainment through 2009); (2) the Dialysis Clinic, Inc. cohort (45,124 incident dialysis patients initiating and receiving their care from 2003 to 2010, with complete outcome ascertainment through 2010); (3) the United States Renal Data System (333,308 incident dialysis patients from 2006 to 2009, with complete outcome ascertainment through 2010); and (4) the Cleveland Clinic Foundation Chronic Kidney Disease Registry (53,399 patients with chronic kidney disease, with outcome ascertainment from 2005 through 2009). We ascertain patient-reported outcomes (i.e., health-related quality of life), morbidity, and mortality using clinical and administrative data and data obtained from national death indices. We use advanced statistical methods (e.g., propensity scoring and marginal structural modeling) to account for potential biases of our study designs. All data are de-identified for analyses. The conduct of studies and dissemination of findings are guided by input from stakeholders in the ESRD community. DISCUSSION: The DEcIDE Patient Outcomes in ESRD Study will provide needed evidence regarding the effectiveness of common treatments employed for dialysis patients. Carefully planned dissemination strategies to the ESRD community will enhance the studies' impact on clinical care and patients' outcomes.
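The propensity-score methods mentioned above can be illustrated with a toy example. The sketch below shows the inverse-probability-of-treatment weighting idea that underlies propensity-score adjustment; the function name, treatment indicators, and propensity values are hypothetical and not taken from the study.

```python
def iptw_weights(treated, propensity):
    """Inverse-probability-of-treatment weights for propensity-score
    adjustment: 1/e for treated subjects and 1/(1 - e) for controls,
    where e is the estimated probability of receiving treatment."""
    return [1 / e if t else 1 / (1 - e) for t, e in zip(treated, propensity)]

# Hypothetical subjects: treatment indicator and estimated propensity score.
weights = iptw_weights([1, 0, 1, 0], [0.8, 0.8, 0.4, 0.4])
print(weights)  # treated subjects with high propensity receive small weights
```

Weighting each subject by the inverse probability of the treatment actually received creates a pseudo-population in which measured confounders are balanced between treatment groups, which is the basis of marginal structural models.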
Abstract:
BACKGROUND: Enhanced recovery after surgery (ERAS) is a multimodal approach to perioperative care that combines a range of interventions to enable early mobilization and feeding after surgery. We investigated the feasibility, clinical effectiveness, and cost savings of an ERAS program at a major U.S. teaching hospital. METHODS: Data were collected from consecutive patients undergoing open or laparoscopic colorectal surgery during two time periods, before and after implementation of an ERAS protocol. Data collected included patient demographics; operative and perioperative surgical and anesthesia data; need for analgesics; complications; inpatient medical costs; and 30-day readmission rates. RESULTS: There were 99 patients in the traditional care group and 142 in the ERAS group. The median length of stay (LOS) was 5 days in the ERAS group compared with 7 days in the traditional group (P < 0.001). The reduction in LOS was significant for both open procedures (median 6 vs 7 days, P = 0.01) and laparoscopic procedures (4 vs 6 days, P < 0.0001). ERAS patients had fewer urinary tract infections (13% vs 24%, P = 0.03). Readmission rates were lower in ERAS patients (9.8% vs 20.2%, P = 0.02). DISCUSSION: Implementation of an enhanced recovery protocol for colorectal surgery at a tertiary medical center was associated with a significantly reduced LOS and incidence of urinary tract infection. These findings are consistent with those of other studies in the literature and suggest that enhanced recovery programs can be implemented successfully and should be considered in U.S. hospitals.
Abstract:
Prostate growth is dependent on circulating androgens, which can be influenced by hepatic function. Liver disease has been suggested to influence prostate cancer (CaP) incidence. However, the effect of hepatic function on CaP outcomes has not been investigated. A total of 1181 patients who underwent radical prostatectomy (RP) between 1988 and 2008 at the four Veterans Affairs hospitals that comprise the Shared Equal Access Regional Cancer Hospital database and had available liver function test (LFT) data were included in the study. Independent associations of LFTs with unfavorable pathological features and biochemical recurrence were determined using logistic and Cox regression analyses. Serum glutamic oxaloacetic transaminase (SGOT) and serum glutamic pyruvic transaminase (SGPT) levels were elevated in 8.2% and 4.4% of patients, respectively. After controlling for CaP features, logistic regression revealed a significant association between SGOT levels and pathological Gleason sum ≥7 (4+3) cancer (odds ratio = 2.12; 95% confidence interval = 1.11-4.05; P = 0.02). Mild hepatic dysfunction was significantly associated with adverse CaP grade but was not significantly associated with other adverse pathological features or biochemical recurrence in this cohort of men undergoing RP. The effect of moderate-to-severe liver disease on disease outcomes in CaP patients managed non-surgically remains to be investigated.
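For readers unfamiliar with how a reported odds ratio and confidence interval relate to the underlying logistic-regression output, a minimal sketch follows. The standard error here is an illustrative assumption chosen so that the Wald interval roughly reproduces the abstract's reported values; it is not taken from the paper.

```python
import math

def odds_ratio_ci(beta, se, z=1.96):
    """Convert a logistic-regression coefficient (beta) and its standard
    error into an odds ratio with a Wald 95% confidence interval."""
    return math.exp(beta), math.exp(beta - z * se), math.exp(beta + z * se)

# Illustrative values: beta = ln(2.12) with an assumed SE of 0.33
# roughly reproduces the reported OR of 2.12 (95% CI 1.11-4.05).
point, lower, upper = odds_ratio_ci(math.log(2.12), 0.33)
print(f"OR = {point:.2f}, 95% CI = ({lower:.2f}, {upper:.2f})")
```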
Abstract:
With the lifetime risk of being diagnosed with prostate cancer so great, an effective chemopreventive agent could have a profound impact on the lives of men. Despite decades of searching for such an agent, physicians still do not have an approved drug to offer their patients. In this article, we outline current strategies for preventing prostate cancer in general, with a focus on the 5-α-reductase inhibitors (5-ARIs) finasteride and dutasteride. We discuss the two landmark randomized, controlled trials of finasteride and dutasteride, highlighting the controversies stemming from the results, and address the issue of 5-ARI use, including reasons why providers may be hesitant to use these agents for chemoprevention. We further discuss the recent US Food and Drug Administration ruling against the proposed new indication for dutasteride and the change to the labeling of finasteride, both of which were intended to permit physicians to use the drugs for chemoprevention. Finally, we discuss future directions for 5-ARI research.
Abstract:
Approximately 45,000 individuals are hospitalized annually for burn treatment. Rehabilitation after hospitalization can offer significant improvement in functional outcomes. Very little is known nationally about rehabilitation for burns, and observed Medicare post-hospitalization spending suggests that practices may vary substantially by region. This study was designed to measure variation in rehabilitation utilization, by state of hospitalization, for patients hospitalized with burn injury. This retrospective cohort study used nationally collected data over a 10-year period (2001 to 2010) from the Healthcare Cost and Utilization Project (HCUP) State Inpatient Databases (SIDs). Patients hospitalized for burn injury (n = 57,968) were identified by ICD-9-CM codes and examined to determine whether they were discharged directly to inpatient rehabilitation after hospitalization (the primary endpoint). Both unadjusted and adjusted likelihoods were calculated for each state, taking into account the effects of age, insurance status, hospitalization at a burn center, and extent of burn injury by total body surface area (TBSA). The relative risk of discharge to inpatient rehabilitation varied by as much as 6-fold among states. Higher TBSA, having health insurance, older age, and burn center hospitalization all increased the likelihood of discharge to inpatient rehabilitation following acute care hospitalization. There was significant variation between states in inpatient rehabilitation utilization after adjusting for variables known to affect the outcome. Future efforts should focus on identifying the cause of this state-to-state variation, its relationship to patient outcomes, and standardizing treatment across the United States.
Abstract:
OBJECTIVE: To ascertain the degree of variation, by state of hospitalization, in outcomes associated with traumatic brain injury (TBI) in a pediatric population. DESIGN: A retrospective cohort study of pediatric patients admitted to a hospital with a TBI. SETTING: Hospitals from states in the United States that voluntarily participate in the Agency for Healthcare Research and Quality's Healthcare Cost and Utilization Project. PARTICIPANTS: Pediatric (age ≤19 y) patients hospitalized for TBI (N = 71,476) in the United States during 2001, 2004, 2007, and 2010. INTERVENTIONS: None. MAIN OUTCOME MEASURES: The primary outcome was the proportion of patients discharged to rehabilitation after an acute care hospitalization among patients discharged alive. The secondary outcome was inpatient mortality. RESULTS: The relative risk of discharge to inpatient rehabilitation varied by as much as 3-fold among the states, and the relative risk of inpatient mortality varied by nearly 2-fold. In the United States, approximately 1981 patients could be discharged to inpatient rehabilitation care if the observed variation in outcomes were eliminated. CONCLUSIONS: There was significant variation between states in both rehabilitation discharge and inpatient mortality after adjusting for variables known to affect each outcome. Future efforts should focus on identifying the cause of this state-to-state variation, its relationship to patient outcome, and standardizing treatment across the United States.
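The state-to-state comparisons in the two abstracts above reduce to ratios of simple proportions. A sketch with hypothetical counts (not from either study) shows how a 3-fold relative risk between two states arises:

```python
def relative_risk(events_a, n_a, events_b, n_b):
    """Unadjusted relative risk of an outcome (e.g., discharge to
    inpatient rehabilitation) in group A versus group B."""
    return (events_a / n_a) / (events_b / n_b)

# Hypothetical counts for two states (illustrative only):
# state A discharges 300 of 2000 patients to rehabilitation (15%),
# state B discharges 100 of 2000 (5%).
rr = relative_risk(300, 2000, 100, 2000)
print(f"RR = {rr:.1f}")  # 0.15 / 0.05 = 3.0
```

The studies' adjusted comparisons additionally control for covariates such as age, insurance status, and injury severity, so adjusted relative risks can differ from this unadjusted ratio.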
Abstract:
We evaluated whether Pavlovian conditioning methods could be used to increase the ingestion of non-preferred solutions by formula-fed human infants. In baseline measures, 5- to 7-month-old infants sucked less frequently and consumed less when offered water than when offered regular formula. During a 3-day olfactory conditioning period, parents placed a small scented disk, the conditioned stimulus, on the rim of their infant's formula bottle at every feeding. Following this training, infants' responses to water were tested when their water bottles had a disk scented with the training odor, a novel odor, or no odor. Infants tested with the training odor sucked more frequently and consumed significantly more water than they had at baseline. Infants tested with no odor or a novel odor consumed water at or below baseline levels. These data demonstrate that olfactory conditioning can be used to enhance ingestion in infants and suggest that such methods may be useful for infants who have difficulty transitioning from one diet to another.
Abstract:
BACKGROUND: Several trials have demonstrated the efficacy of nurse telephone case management for diabetes (DM) and hypertension (HTN) in academic or vertically integrated systems. Little is known about the real-world potency of these interventions. OBJECTIVE: To assess the effectiveness of nurse behavioral management of DM and HTN in community practices among patients with both diseases. DESIGN: The study was designed as a patient-level randomized controlled trial. PARTICIPANTS: Participants included adult patients with both type 2 DM and HTN who were receiving care at one of nine community fee-for-service practices. Subjects were required to have inadequately controlled DM (hemoglobin A1c [A1c] ≥7.5%) but could have well-controlled HTN. INTERVENTIONS: All patients received a call from a nurse experienced in DM and HTN management once every two months over a period of two years, for a total of 12 calls. Intervention patients received tailored DM- and HTN-focused behavioral content; control patients received non-tailored, non-interactive information on health issues unrelated to DM and HTN (e.g., skin cancer prevention). MAIN OUTCOMES AND MEASURES: Systolic blood pressure (SBP) and A1c were co-primary outcomes, measured at 6, 12, and 24 months; 24 months was the primary time point. RESULTS: Three hundred seventy-seven subjects were enrolled; 193 were randomized to intervention and 184 to control. Subjects were 55% female and 50% white; the mean baseline A1c was 9.1% (SD = 1%) and the mean SBP was 142 mmHg (SD = 20). Eighty-two percent of scheduled interviews were conducted; 69% of intervention patients and 70% of control patients reached the 24-month time point. Expressing model-estimated differences as (intervention minus control), at 24 months intervention patients had similar A1c [diff = 0.1%, 95% CI (-0.3, 0.5), p = 0.51] and SBP [diff = -0.9 mmHg, 95% CI (-5.4, 3.5), p = 0.68] values compared with control patients.
Likewise, DBP (diff = 0.4 mmHg, p = 0.76), weight (diff = 0.3 kg, p = 0.80), and physical activity levels (diff = 153 MET-min/week, p = 0.41) were similar between control and intervention patients. Results were also similar at the 6- and 12-month time points. CONCLUSIONS: In nine community fee-for-service practices, telephonic nurse case management did not lead to improvement in A1c or SBP. Gains seen in telephonic behavioral self-management interventions in optimal settings may not translate to the wider range of primary care settings.
Abstract:
BACKGROUND: Early preparation for renal replacement therapy (RRT) is recommended for patients with advanced chronic kidney disease (CKD), yet many patients initiate RRT urgently and/or are inadequately prepared. METHODS: We conducted audio-recorded, qualitative, directed telephone interviews of nephrology health care providers (n = 10; nephrologists, physician assistants, and nurses) and primary care physicians (PCPs, n = 4) to identify modifiable challenges to optimal RRT preparation to inform future interventions. We recruited providers from public safety-net hospital-based and community-based nephrology and primary care practices. We asked providers open-ended questions to assess their perceived challenges and their views on the role of PCPs and nephrologist-PCP collaboration in patients' RRT preparation. Two independent, trained abstractors coded transcribed audio-recorded interviews and identified major themes. RESULTS: Nephrology providers identified several factors contributing to patients' suboptimal RRT preparation, including health system resources (e.g., limited time for preparation, referral process delays, and poorly integrated nephrology and primary care), provider skills (e.g., difficulty explaining CKD to patients), and patient attitudes and cultural differences (e.g., poor understanding and acceptance of their CKD and its treatment options, low perceived urgency for RRT preparation, negative perceptions about RRT, lack of trust, and language differences). PCPs desired more involvement in preparation to ensure RRT transitions could be as "smooth as possible", including providing patients with emotional support, helping patients weigh RRT options, and affirming nephrologist recommendations. Both nephrology providers and PCPs desired improved collaboration, including better information exchange and delineation of roles during the RRT preparation process.
CONCLUSIONS: Nephrology and primary care providers identified health system resources, provider skills, and patient attitudes and cultural differences as challenges to patients' optimal RRT preparation. Interventions to improve these factors may improve patients' preparation and initiation of optimal RRTs.
Abstract:
In preventing invasive fungal disease (IFD) in patients with acute myelogenous leukemia (AML) or myelodysplastic syndrome (MDS), clinical trials demonstrated the efficacy of posaconazole over fluconazole and itraconazole. However, the effectiveness of posaconazole has not been investigated in the United States in a real-world setting outside of a controlled clinical trial. We performed a single-center, retrospective cohort study of 130 evaluable patients ≥18 years of age admitted to Duke University Hospital between 2004 and 2010 who received either posaconazole or fluconazole as prophylaxis during first induction or first reinduction chemotherapy for AML or MDS. The primary endpoint was possible, probable, or definite breakthrough IFD. Baseline characteristics were well balanced between groups, except that posaconazole recipients received reinduction chemotherapy and cytarabine more frequently. IFD occurred in 17/65 (26.2%) patients in the fluconazole group and in 6/65 (9.2%) in the posaconazole group (P = 0.012). Definite/probable IFDs occurred in 7 (10.8%) and 0 (0%) patients, respectively (P = 0.0013). In multivariate analysis, fluconazole prophylaxis and duration of neutropenia were predictors of IFD. Mortality was similar between groups. This study demonstrates the superior effectiveness of posaconazole over fluconazole as prophylaxis against IFD in AML and MDS patients. This superiority did not, however, translate into a reduction in 100-day all-cause mortality.
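The headline comparison (17/65 vs 6/65, P = 0.012) is consistent with a Pearson chi-square test on the underlying 2×2 table. The sketch below assumes that test; the abstract does not state the authors' exact method, so treat this as illustrative.

```python
from math import erfc, sqrt

def chi_square_2x2(a, b, c, d):
    """Pearson chi-square test (1 df, no continuity correction) for a
    2x2 table [[a, b], [c, d]]; returns the two-sided p-value."""
    n = a + b + c + d
    chi2 = n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))
    return erfc(sqrt(chi2 / 2))  # survival function of chi-square, 1 df

# IFD counts from the abstract: 17 of 65 on fluconazole vs 6 of 65
# on posaconazole (non-events: 48 and 59).
p = chi_square_2x2(17, 48, 6, 59)
print(f"p = {p:.4f}")  # approximately matches the reported P = 0.012
```

An exact test (e.g., Fisher's) on the same table would yield a somewhat larger p-value; the chi-square approximation is reasonable here because all expected cell counts exceed 10.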
Abstract:
BACKGROUND: P2Y12 antagonist therapy improves outcomes in acute myocardial infarction (MI) patients. Novel agents in this class are now available in the US. We studied the introduction of prasugrel into contemporary MI practice to understand the appropriateness of its use and to assess for changes in antiplatelet management practices. METHODS AND RESULTS: Using the ACTION Registry-GWTG (Get With The Guidelines), we evaluated patterns of P2Y12 antagonist use within 24 hours of admission in 100,228 ST-elevation myocardial infarction (STEMI) and 158,492 non-ST-elevation myocardial infarction (NSTEMI) patients at 548 hospitals between October 2009 and September 2012. Rates of early P2Y12 antagonist use were approximately 90% among STEMI and 57% among NSTEMI patients. From 2009 to 2012, prasugrel use increased significantly from 3% to 18% overall (5% to 30% in STEMI; 2% to 10% in NSTEMI; P for trend <0.001 for all). During the same period, we observed a decrease in early, but not discharge, P2Y12 antagonist use among NSTEMI patients. Although contraindicated, prasugrel was given to 3.0% of patients with prior stroke. Prasugrel was used in 1.9% of patients ≥75 years of age and in 4.5% of patients weighing <60 kg. In both STEMI and NSTEMI, prasugrel was most frequently used in patients at the lowest predicted risk for bleeding and mortality. Despite a lack of supporting evidence, prasugrel was initiated before cardiac catheterization in 18% of NSTEMI patients. CONCLUSIONS: With prasugrel available as an antiplatelet treatment option, contemporary practice shows low uptake of prasugrel and delays in P2Y12 antagonist initiation among NSTEMI patients. We also note concerning evidence of inappropriate prasugrel use and inadequate targeting of this more potent therapy to maximize the benefit/risk ratio.
Abstract:
Anesthesia providers in low-income countries may infrequently provide regional anesthesia techniques for obstetrics due to insufficient training and supplies, limited manpower, and a lack of perceived need. In 2007, Kybele, Inc. began a 5-year collaboration in Ghana to improve obstetric anesthesia services. A program was designed to teach spinal anesthesia for cesarean delivery and spinal labor analgesia at Ridge Regional Hospital, Accra, the second largest obstetric unit in Ghana. The use of spinal anesthesia for cesarean delivery increased significantly from 6% in 2006 to 89% in 2009. By 2012, >90% of cesarean deliveries were conducted with spinal anesthesia, despite a doubling of the number performed. A trial of spinal labor analgesia was assessed in a small cohort of parturients with minimal complications; however, protocol deviations were observed. Although subsequent efforts to provide spinal analgesia in the labor ward were hampered by anesthesia provider shortages, spinal anesthesia for cesarean delivery proved to be practical and sustainable.
Abstract:
BACKGROUND: Some of the 600,000 patients with solid organ allotransplants need reconstruction with a composite tissue allotransplant, such as the hand, abdominal wall, or face. The aim of this study was to develop a rat model for assessing the effects of a secondary composite tissue allotransplant on a primary heart allotransplant. METHODS: Hearts of Wistar Kyoto rats were harvested and transplanted heterotopically to the neck of recipient Fisher 344 rats. The anastomoses were performed between the donor brachiocephalic artery and the recipient left common carotid artery, and between the donor pulmonary artery and the recipient external jugular vein. Recipients received cyclosporine A for 10 days only. Heart rate was assessed noninvasively. The sequential composite tissue allotransplant consisted of a 3 × 3-cm abdominal musculocutaneous flap harvested from Lewis rats and transplanted to the abdomen of the heart allotransplant recipients. The abdominal flap vessels were connected to the femoral vessels. No further immunosuppression was administered following the composite tissue allotransplant. Ten days after composite tissue allotransplantation, rejection of the heart and abdominal flap was assessed histologically. RESULTS: The rat survival rate of the two-stage transplant surgery was 80 percent. The transplanted heart rate decreased from 150 ± 22 beats per minute immediately after transplant to 83 ± 12 beats per minute on day 20 (10 days after stopping immunosuppression). CONCLUSIONS: This sequential allotransplant model is technically demanding. It will facilitate investigation of the effects of a secondary composite tissue allotransplant following primary solid organ transplantation and could be useful in developing future immunotherapeutic strategies.
Abstract:
BACKGROUND: In patients with myelomeningocele (MMC), a high number of fractures occur in the paralyzed extremities, affecting mobility and independence. The aims of this retrospective cross-sectional study were to determine the frequency of fractures in our patient cohort and to identify trends and risk factors relevant to such fractures. MATERIALS AND METHODS: Between March 1988 and June 2005, 862 patients with MMC were treated at our hospital. The medical records, surgery reports, and X-rays of these patients were evaluated. RESULTS: During the study period, 11% of the patients (n = 92) suffered one or more fractures. Risk analysis showed that patients with MMC and thoracic-level paralysis had a sixfold higher risk of fracture compared with those with sacral-level paralysis. Femoral-neck z-scores measured by dual-energy X-ray absorptiometry (DEXA) differed significantly according to the level of neurological impairment, with lower z-scores in children with higher-level lesions. Furthermore, the rate of epiphyseal separation increased noticeably after cast immobilization, mainly affecting patients who could walk relatively well. CONCLUSIONS: Patients with thoracic-level paralysis represent a group at high fracture risk. According to these results, fractures and epiphyseal injuries in patients with MMC should be treated by plaster immobilization, but the duration of immobilization should be kept to a minimum (<4 weeks) because of the increased risk of secondary fractures. Alternatively, patients with refractures can be treated surgically when nonoperative treatment has failed.