823 results for probabilistic risk assessment
Abstract:
Since at least the early 1990s, stage and risk migration have been seen in patients with prostate cancer, likely corresponding to the institution of prostate-specific antigen (PSA) screening in health systems. Preoperative risk factors, including PSA level and clinical stage, have decreased significantly. These improved prognostic variables have led to a larger proportion of men being stratified with low-risk disease, as per the classification of D'Amico and associates. This, in turn, has corresponded with more favorable postoperative variables, including decreased extraprostatic tumor extension and prolonged biochemical recurrence-free survival. The advent of focal therapy is bolstered by findings of increased unilateral disease with decreased tumor volume. Increasingly, targeted or delayed therapies may be possible within the current era of lower-risk disease.
Abstract:
BACKGROUND: Malignant glioma is a rare cancer with poor survival. The influence of diet and antioxidant intake on glioma survival is not well understood. The current study examines the association between antioxidant intake and survival after glioma diagnosis. METHODS: Adult patients diagnosed with malignant glioma during 1991-1994 and 1997-2001 were enrolled in a population-based study. Diagnosis was confirmed by review of pathology specimens. A modified food-frequency questionnaire interview was completed by each glioma patient or a designated proxy. Intake of each food item was converted to grams consumed per day. From this nutrient database, 16 antioxidants, calcium, a total antioxidant index, and 3 macronutrients were available for survival analysis. Cox regression estimated mortality hazard ratios associated with each nutrient and the antioxidant index, adjusting for potential confounders. Nutrient values were categorized into tertiles. Models were stratified by histology (Grades II, III, and IV) and conducted for all (including proxy) subjects and for a subset of self-reported subjects. RESULTS: Geometric mean values for 11 fat-soluble and 6 water-soluble individual antioxidants, the antioxidant index, and 3 macronutrients were virtually the same when comparing all cases (n=748) to self-reported cases only (n=450). For patients diagnosed with Grade II and Grade III histology, moderate (915.8-2118.3 mcg) intake of fat-soluble lycopene was associated with poorer survival when compared to low intake (0.0-914.8 mcg), for self-reported cases only. High intake of vitamin E and moderate/high intake of secoisolariciresinol among Grade III patients indicated greater survival for all cases. In Grade IV patients, moderate/high intake of cryptoxanthin and high intake of secoisolariciresinol were associated with poorer survival among all cases. Among Grade II patients, moderate intake of water-soluble folate was associated with greater survival for all cases; high intake of vitamin C and genistein and the highest level of the antioxidant index were associated with poorer survival for all cases. CONCLUSIONS: The associations observed in our study suggest that the influence of some antioxidants on survival following a diagnosis of malignant glioma is inconsistent and varies by histology group. Further research in a large sample of glioma patients is needed to confirm or refute our results.
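A minimal sketch of the kind of tertile-based Cox model the abstract describes, using the lifelines package; the file and column names (glioma_nutrients.csv, lycopene_mcg, survival_months, died, grade) are hypothetical placeholders, and stratification is implemented here as Cox strata rather than the separate per-grade models the study may have fit.

```python
# Sketch of a tertile-based Cox model stratified by histology grade.
# Column and file names are hypothetical, not the study's dataset.
import pandas as pd
from lifelines import CoxPHFitter

df = pd.read_csv("glioma_nutrients.csv")  # hypothetical input file

# Categorize nutrient intake into tertiles (0 = low, 1 = moderate, 2 = high).
df["lycopene_tertile"] = pd.qcut(df["lycopene_mcg"], q=3, labels=[0, 1, 2])

# Dummy-code the tertiles so the low tertile is the reference group.
design = pd.get_dummies(df, columns=["lycopene_tertile"],
                        drop_first=True, dtype=float)
cols = ["survival_months", "died", "grade",
        "lycopene_tertile_1", "lycopene_tertile_2"]

# Stratifying by grade fits a separate baseline hazard per histology group;
# additional confounder terms are omitted here for brevity.
cph = CoxPHFitter()
cph.fit(design[cols], duration_col="survival_months",
        event_col="died", strata=["grade"])
cph.print_summary()  # exp(coef) column gives the mortality hazard ratios
```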
Variation in use of surveillance colonoscopy among colorectal cancer survivors in the United States.
Abstract:
BACKGROUND: Clinical practice guidelines recommend colonoscopies at regular intervals for colorectal cancer (CRC) survivors. Using data from a large, multi-regional, population-based cohort, we describe the rate of surveillance colonoscopy and its association with geographic, sociodemographic, clinical, and health services characteristics. METHODS: We studied CRC survivors enrolled in the Cancer Care Outcomes Research and Surveillance (CanCORS) study. Eligible survivors were diagnosed between 2003 and 2005, had surgery for CRC with curative intent, and were alive without recurrence 14 months after surgery. Data came from patient interviews and medical record abstraction. We used a multivariate logit model to identify predictors of colonoscopy use. RESULTS: Despite guidelines recommending surveillance, only 49% of the 1423 eligible survivors received a colonoscopy within 14 months after surgery, with large differences across regions (38% to 57%). Survivors who received surveillance colonoscopy were more likely to have colon cancer rather than rectal cancer (OR = 1.41, 95% CI: 1.05-1.90), to have visited a primary care physician (OR = 1.44, 95% CI: 1.14-1.82), and to have received adjuvant chemotherapy (OR = 1.75, 95% CI: 1.27-2.41). Compared to survivors with no comorbidities, survivors with moderate or severe comorbidities were less likely to receive surveillance colonoscopy (OR = 0.69, 95% CI: 0.49-0.98 and OR = 0.44, 95% CI: 0.29-0.66, respectively). CONCLUSIONS: Despite guidelines, more than half of CRC survivors did not receive surveillance colonoscopy within 14 months of surgery, with substantial variation by site of care. The associations with primary care visits and adjuvant chemotherapy use suggest that access to care following surgery affects cancer surveillance.
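The "multivariate logit model" step could look roughly like the following statsmodels sketch; the data file and variable names are hypothetical, not the CanCORS analytic dataset, and the covariate list is illustrative.

```python
# Sketch of a multivariable logistic model of surveillance colonoscopy use.
# File and column names are hypothetical placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("cancors_survivors.csv")  # hypothetical input file

model = smf.logit(
    "colonoscopy_14mo ~ C(cancer_site) + saw_pcp + adjuvant_chemo"
    " + C(comorbidity) + C(region) + age + C(insurance)",
    data=df,
).fit()

# Report odds ratios with 95% confidence intervals, as in the abstract.
ors = pd.DataFrame({
    "OR": np.exp(model.params),
    "CI_low": np.exp(model.conf_int()[0]),
    "CI_high": np.exp(model.conf_int()[1]),
})
print(ors.round(2))
```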
Abstract:
BACKGROUND: One year after the introduction of Information and Communication Technology (ICT) to support diagnostic imaging at our hospital, clinicians had faster and better access to radiology reports and images; direct access to Computed Tomography (CT) reports in the Electronic Medical Record (EMR) was particularly popular. The objective of this study was to determine whether improvements in radiology reporting and clinical access to diagnostic imaging information one year after the ICT introduction were associated with a reduction in the length of patients' hospital stays (LOS). METHODS: Data describing hospital stays and diagnostic imaging were collected retrospectively from the EMR during periods of equal duration before and one year after the introduction of ICT. The post-ICT period was chosen because of the documented improvement in clinical access to radiology results during that period. The data set was randomly split into an exploratory part used to establish the hypotheses and a confirmatory part. The data were used both to compare pre-ICT and post-ICT status and to compare differences between groups. RESULTS: There was no general reduction in LOS one year after ICT introduction. However, there was a 25% reduction for one group: patients with CT scans. This group was heterogeneous, covering 445 different primary discharge diagnoses. Analyses of subgroups were performed to reduce the impact of this divergence. CONCLUSION: Our results did not indicate that improved access to radiology results reduced the patients' LOS overall. There was, however, a significant reduction in LOS for patients undergoing CT scans. Given the clinicians' interest in CT reports and the results of the subgroup analyses, it is likely that improved access to CT reports contributed to this reduction.
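The exploratory/confirmatory split and the pre- versus post-ICT comparison of LOS could be sketched as below; the column names (period, had_ct, los_days) and the Mann-Whitney test choice are assumptions, since the abstract does not name the test used.

```python
# Sketch of the exploratory/confirmatory split and a pre- vs post-ICT
# comparison of length of stay (LOS); column names are hypothetical.
import numpy as np
import pandas as pd
from scipy.stats import mannwhitneyu

df = pd.read_csv("hospital_stays.csv")  # hypothetical: one row per stay

# Randomly split stays into an exploratory half (hypothesis generation)
# and a confirmatory half (hypothesis testing).
rng = np.random.default_rng(seed=42)
mask = rng.random(len(df)) < 0.5
exploratory, confirmatory = df[mask], df[~mask]

def compare_los(data, group_label="had_ct"):
    """Compare LOS before vs after ICT introduction within a subgroup."""
    sub = data[data[group_label] == 1]
    pre = sub.loc[sub["period"] == "pre", "los_days"]
    post = sub.loc[sub["period"] == "post", "los_days"]
    stat, p = mannwhitneyu(pre, post, alternative="two-sided")
    return pre.median(), post.median(), p

# Test only hypotheses formed on the exploratory half.
print(compare_los(confirmatory))
```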
Abstract:
BACKGROUND: The respiratory tract is a major target of exposure to air pollutants, and respiratory diseases are associated with both short- and long-term exposures. We hypothesized that improved air quality in North Carolina was associated with reduced rates of death from respiratory diseases in local populations. MATERIALS AND METHODS: We analyzed trends in emphysema, asthma, and pneumonia mortality and changes in the levels of ozone, sulfur dioxide (SO2), nitrogen dioxide (NO2), carbon monoxide (CO), and particulate matter (PM2.5 and PM10) using monthly data measurements from air-monitoring stations in North Carolina in 1993-2010. A log-linear model was used to evaluate associations between air-pollutant levels and age-adjusted death rates (per 100,000 population) calculated for 5-year age groups and for the standard 2000 North Carolina population. The associations were adjusted for age group-specific smoking prevalence and seasonal fluctuations in disease-specific respiratory deaths. RESULTS: The decline in emphysema deaths was associated with decreasing levels of SO2 and CO in the air; the decline in asthma deaths with lower SO2, CO, and PM10 levels; and the decline in pneumonia deaths with lower levels of SO2. Sensitivity analyses were performed to study potential effects of the change from International Classification of Diseases (ICD)-9 to ICD-10 codes, the effects of air pollutants on mortality during summer and winter, the impact of using only the underlying causes of death, and analyses of mortality and air-quality data at the county level. In each case, the results of the sensitivity analyses demonstrated stability. The importance of analyzing pneumonia as an underlying cause of death was also highlighted. CONCLUSION: Significant associations were observed between decreasing death rates from emphysema, asthma, and pneumonia and decreases in levels of ambient air pollutants in North Carolina.
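A hedged sketch of a log-linear model of monthly age-adjusted death rates on pollutant levels, with smoking prevalence and month-of-year terms for seasonal adjustment; all file and column names are hypothetical, and the exact specification in the study may differ.

```python
# Sketch of a log-linear model relating monthly age-adjusted death rates to
# ambient pollutant levels; column names are hypothetical placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

monthly = pd.read_csv("nc_monthly_rates.csv")  # hypothetical input file

# Log-linear form: log(rate) modeled as a linear function of pollutant levels;
# C(month) dummies absorb seasonal fluctuations in respiratory deaths.
model = smf.ols(
    "np.log(emphysema_rate) ~ so2 + co + no2 + ozone + pm10 + pm25"
    " + smoking_prev + C(month)",
    data=monthly,
).fit()

# A coefficient b on SO2 implies roughly a 100*b percent change in the death
# rate per unit increase in SO2, holding the other terms fixed.
print(model.summary())
```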
Abstract:
Approximately 45,000 individuals are hospitalized annually for burn treatment. Rehabilitation after hospitalization can offer a significant improvement in functional outcomes. Very little is known nationally about rehabilitation after burn injury, and observed Medicare post-hospitalization spending suggests that practices vary substantially by region. This study was designed to measure variation in rehabilitation utilization by state of hospitalization for patients hospitalized with burn injury. This retrospective cohort study used nationally collected data over a 10-year period (2001 to 2010) from the Healthcare Cost and Utilization Project (HCUP) State Inpatient Databases (SIDs). Patients hospitalized for burn injury (n = 57,968) were identified by ICD-9-CM codes, and we specifically examined whether they were discharged directly to inpatient rehabilitation after hospitalization (primary endpoint). Both unadjusted and adjusted likelihoods were calculated for each state, taking into account the effects of age, insurance status, hospitalization at a burn center, and extent of burn injury by total body surface area (TBSA). The relative risk of discharge to inpatient rehabilitation varied by as much as 6-fold among different states. Higher TBSA, having health insurance, higher age, and burn center hospitalization all increased the likelihood of discharge to inpatient rehabilitation following acute care hospitalization. There was significant variation between states in inpatient rehabilitation utilization after adjusting for variables known to affect each outcome. Future efforts should be focused on identifying the cause of this state-to-state variation, its relationship to patient outcomes, and standardizing treatment across the United States.
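One plausible way to obtain covariate-adjusted state-level likelihoods is a logistic model with state indicators followed by marginal standardization, sketched below; variable names are hypothetical and do not correspond to the HCUP SID coding.

```python
# Sketch of state-level variation in discharge to inpatient rehabilitation,
# adjusted for age, insurance, burn-center care, and TBSA. Names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

burns = pd.read_csv("hcup_burn_stays.csv")  # hypothetical input file

fit = smf.logit(
    "discharge_rehab ~ C(state) + age + C(insurance) + burn_center + tbsa_pct",
    data=burns,
).fit()

# Marginal standardization: predict each patient's probability as if treated
# in each state, then average, giving covariate-adjusted state rates.
adjusted = {}
for state in burns["state"].unique():
    counterfactual = burns.assign(state=state)
    adjusted[state] = fit.predict(counterfactual).mean()

rates = pd.Series(adjusted).sort_values()
print(rates.round(3))
print("max/min adjusted ratio:", round(rates.max() / rates.min(), 1))
```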
Abstract:
OBJECTIVE: To ascertain the degree of variation, by state of hospitalization, in outcomes associated with traumatic brain injury (TBI) in a pediatric population. DESIGN: A retrospective cohort study of pediatric patients admitted to a hospital with a TBI. SETTING: Hospitals from states in the United States that voluntarily participate in the Agency for Healthcare Research and Quality's Healthcare Cost and Utilization Project. PARTICIPANTS: Pediatric (age ≤ 19 y) patients hospitalized for TBI (N=71,476) in the United States during 2001, 2004, 2007, and 2010. INTERVENTIONS: None. MAIN OUTCOME MEASURES: The primary outcome was the proportion of patients discharged to rehabilitation after an acute care hospitalization among alive discharges. The secondary outcome was inpatient mortality. RESULTS: The relative risk of discharge to inpatient rehabilitation varied by as much as 3-fold among the states, and the relative risk of inpatient mortality varied by nearly 2-fold. In the United States, approximately 1981 patients could be discharged to inpatient rehabilitation care if the observed variation in outcomes were eliminated. CONCLUSIONS: There was significant variation between states in both rehabilitation discharge and inpatient mortality after adjusting for variables known to affect each outcome. Future efforts should be focused on identifying the cause of this state-to-state variation, its relationship to patient outcomes, and standardizing treatment across the United States.
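The "approximately 1981 patients" figure is the kind of counterfactual count obtained by applying a benchmark rehabilitation-discharge rate to each state's discharges; the sketch below illustrates that arithmetic with invented numbers, not the study's data.

```python
# Back-of-the-envelope sketch: compare each state's observed rehabilitation
# discharges with the count expected under a benchmark rate.
# All rates and counts below are illustrative, not the study's data.
import pandas as pd

states = pd.DataFrame({
    "state": ["A", "B", "C"],
    "alive_discharges": [12000, 8000, 6000],   # hypothetical counts
    "rehab_rate": [0.06, 0.10, 0.15],          # hypothetical observed rates
})

benchmark = states["rehab_rate"].max()  # e.g., the best-performing state

states["observed_rehab"] = states["alive_discharges"] * states["rehab_rate"]
states["expected_rehab"] = states["alive_discharges"] * benchmark
additional = (states["expected_rehab"] - states["observed_rehab"]).sum()
print(f"Additional rehabilitation discharges under the benchmark: {additional:.0f}")
```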
Abstract:
BACKGROUND: Little is known about the constraints of optimizing health care for prostate cancer survivors in Alaska primary care. OBJECTIVE: To describe the experiences and attitudes of primary care providers within the Alaska Tribal Health System (ATHS) regarding the care of prostate cancer survivors. DESIGN: In late October 2011, we emailed a 22-item electronic survey to 268 ATHS primary care providers regarding the frequency of Prostate Specific Antigen (PSA) monitoring for a hypothetical prostate cancer survivor; who should be responsible for the patient's life-long prostate cancer surveillance; who should support the patient's emotional and medical needs as a survivor; and providers' level of comfort addressing recurrence monitoring, erectile dysfunction, urinary incontinence, androgen deprivation therapy, and emotional needs. We used simple logistic regression to examine the association between provider characteristics and their responses to the survivorship survey items. RESULTS: Of 221 individuals who were successfully contacted, a total of 114 responded (52% response rate). Most ATHS providers indicated they would order a PSA test every 12 months (69%) and believed that, ideally, the hypothetical patient's primary care provider should be responsible for his life-long prostate cancer surveillance (60%). Most providers reported feeling either "moderately" or "very" comfortable addressing topics such as prostate cancer recurrence (59%), erectile dysfunction (64%), urinary incontinence (63%), and emotional needs (61%) with prostate cancer survivors. These results varied somewhat by provider characteristics including female sex, years in practice, and the number of prostate cancer survivors seen in their practice. CONCLUSIONS: These data suggest that most primary care providers in Alaska are poised to assume the care of prostate cancer survivors locally. However, we also found that large minorities of providers do not feel confident in their ability to manage common issues in prostate cancer survivorship, implying that continued access to specialists with more expert knowledge would be beneficial.
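A minimal sketch of the "simple logistic regression" used to relate provider characteristics to survey responses, here with a single continuous predictor; the data file and variable names are hypothetical.

```python
# Sketch of a simple (single-predictor) logistic regression relating a provider
# characteristic to a binary survey response; column names are hypothetical.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

survey = pd.read_csv("aths_provider_survey.csv")  # hypothetical input file

# Example: does comfort discussing recurrence (1 = moderately/very comfortable)
# vary with years in practice?
fit = smf.logit("comfortable_recurrence ~ years_in_practice", data=survey).fit()
or_per_year = np.exp(fit.params["years_in_practice"])
ci_low, ci_high = np.exp(fit.conf_int().loc["years_in_practice"])
print(f"OR per additional year in practice: {or_per_year:.2f} "
      f"(95% CI {ci_low:.2f}-{ci_high:.2f})")
```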
Abstract:
The early detection of developmental disorders is key to child outcome, allowing interventions to be initiated which promote development and improve prognosis. Research on autism spectrum disorder (ASD) suggests that behavioral signs can be observed late in the first year of life. Many of these studies involve extensive frame-by-frame video observation and analysis of a child's natural behavior. Although nonintrusive, these methods are extremely time-intensive and require a high level of observer training; thus, they are burdensome for clinical and large population research purposes. This work is a first milestone in a long-term project on non-invasive early observation of children in order to aid in risk detection and research of neurodevelopmental disorders. We focus on providing low-cost computer vision tools to measure and identify ASD behavioral signs based on components of the Autism Observation Scale for Infants (AOSI). In particular, we develop algorithms to measure responses to general ASD risk assessment tasks and activities outlined by the AOSI which assess visual attention by tracking facial features. We show results, including comparisons with expert and nonexpert clinicians, which demonstrate that the proposed computer vision tools can capture critical behavioral observations and potentially augment the clinician's behavioral observations obtained from real in-clinic assessments.
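The following is a generic OpenCV sketch of frame-by-frame face tracking as a coarse proxy for visual attention; it is not the authors' AOSI-based algorithm, and the video file name is hypothetical, but it illustrates the kind of low-cost facial-feature measurement the abstract describes.

```python
# Generic sketch of frame-by-frame face tracking as a coarse attention proxy,
# using OpenCV's stock Haar-cascade face detector. NOT the paper's method.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

cap = cv2.VideoCapture("session.mp4")  # hypothetical in-clinic recording
positions = []                          # per-frame horizontal face position

while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        positions.append(None)          # face not found this frame
        continue
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])  # largest detection
    center = (x + w / 2) / frame.shape[1]               # 0 = left, 1 = right
    positions.append(center)

cap.release()
# Downstream, runs of left/center/right positions could be summarized into
# looks toward a stimulus, analogous to coding visual-attention responses.
```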
Abstract:
BACKGROUND: Several trials have demonstrated the efficacy of nurse telephone case management for diabetes (DM) and hypertension (HTN) in academic or vertically integrated systems. Little is known about the real-world effectiveness of these interventions. OBJECTIVE: To assess the effectiveness of nurse behavioral management of DM and HTN in community practices among patients with both diseases. DESIGN: The study was designed as a patient-level randomized controlled trial. PARTICIPANTS: Participants included adult patients with both type 2 DM and HTN who were receiving care at one of nine community fee-for-service practices. Subjects were required to have inadequately controlled DM (hemoglobin A1c [A1c] ≥ 7.5%) but could have well-controlled HTN. INTERVENTIONS: All patients received a call from a nurse experienced in DM and HTN management once every two months over a period of two years, for a total of 12 calls. Intervention patients received tailored DM- and HTN-focused behavioral content; control patients received non-tailored, non-interactive information regarding health issues unrelated to DM and HTN (e.g., skin cancer prevention). MAIN OUTCOMES AND MEASURES: Systolic blood pressure (SBP) and A1c were co-primary outcomes, measured at 6, 12, and 24 months; 24 months was the primary time point. RESULTS: Three hundred seventy-seven subjects were enrolled; 193 were randomized to intervention and 184 to control. Subjects were 55% female and 50% white; the mean baseline A1c was 9.1% (SD = 1%) and mean SBP was 142 mmHg (SD = 20). Eighty-two percent of scheduled interviews were conducted; 69% of intervention patients and 70% of control patients reached the 24-month time point. Expressing model-estimated differences as intervention minus control, at 24 months intervention patients had similar A1c [diff = 0.1%, 95% CI (-0.3, 0.5), p = 0.51] and SBP [diff = -0.9 mmHg, 95% CI (-5.4, 3.5), p = 0.68] values compared to control patients. Likewise, DBP (diff = 0.4 mmHg, p = 0.76), weight (diff = 0.3 kg, p = 0.80), and physical activity levels (diff = 153 MET-min/week, p = 0.41) were similar between control and intervention patients. Results were also similar at the 6- and 12-month time points. CONCLUSIONS: In nine community fee-for-service practices, telephonic nurse case management did not lead to improvement in A1c or SBP. Gains seen in telephonic behavioral self-management interventions in optimal settings may not translate to the wider range of primary care settings.
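The reported (intervention minus control) contrasts came from the trial's longitudinal models; as a rough illustration only, the sketch below computes an unadjusted 24-month difference in means with a 95% CI, using hypothetical column names.

```python
# Sketch of an unadjusted 24-month (intervention minus control) contrast for
# A1c and SBP with a 95% CI; this approximates, but is not, the trial's
# model-based estimates. Column names are hypothetical.
import pandas as pd
from statsmodels.stats.weightstats import CompareMeans, DescrStatsW

df = pd.read_csv("trial_24mo.csv")  # hypothetical: one row per subject

def diff_ci(outcome):
    arm = df.groupby("arm")[outcome]
    cm = CompareMeans(DescrStatsW(arm.get_group("intervention").dropna()),
                      DescrStatsW(arm.get_group("control").dropna()))
    low, high = cm.tconfint_diff(usevar="unequal")
    diff = cm.d1.mean - cm.d2.mean
    return diff, low, high

for outcome in ["a1c", "sbp"]:
    d, lo, hi = diff_ci(outcome)
    print(f"{outcome}: diff = {d:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```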
Abstract:
BACKGROUND: Early preparation for renal replacement therapy (RRT) is recommended for patients with advanced chronic kidney disease (CKD), yet many patients initiate RRT urgently and/or are inadequately prepared. METHODS: We conducted audio-recorded, qualitative, directed telephone interviews of nephrology health care providers (n = 10, nephrologists, physician assistants, and nurses) and primary care physicians (PCPs, n = 4) to identify modifiable challenges to optimal RRT preparation to inform future interventions. We recruited providers from public safety-net hospital-based and community-based nephrology and primary care practices. We asked providers open-ended questions to assess their perceived challenges and their views on the role of PCPs and nephrologist-PCP collaboration in patients' RRT preparation. Two independent and trained abstractors coded transcribed audio-recorded interviews and identified major themes. RESULTS: Nephrology providers identified several factors contributing to patients' suboptimal RRT preparation, including health system resources (e.g., limited time for preparation, referral process delays, and poorly integrated nephrology and primary care), provider skills (e.g., their difficulty explaining CKD to patients), and patient attitudes and cultural differences (e.g., their poor understanding and acceptance of their CKD and its treatment options, their low perceived urgency for RRT preparation; their negative perceptions about RRT, lack of trust, or language differences). PCPs desired more involvement in preparation to ensure RRT transitions could be as "smooth as possible", including providing patients with emotional support, helping patients weigh RRT options, and affirming nephrologist recommendations. Both nephrology providers and PCPs desired improved collaboration, including better information exchange and delineation of roles during the RRT preparation process. CONCLUSIONS: Nephrology and primary care providers identified health system resources, provider skills, and patient attitudes and cultural differences as challenges to patients' optimal RRT preparation. Interventions to improve these factors may improve patients' preparation and initiation of optimal RRTs.
Abstract:
BACKGROUND: In patients with myelomeningocele (MMC), a high number of fractures occur in the paralyzed extremities, affecting mobility and independence. The aims of this retrospective cross-sectional study were to determine the frequency of fractures in our patient cohort and to identify trends and risk factors relevant to such fractures. MATERIALS AND METHODS: Between March 1988 and June 2005, 862 patients with MMC were treated at our hospital. The medical records, surgery reports, and X-rays from these patients were evaluated. RESULTS: During the study period, 11% of the patients (n = 92) suffered one or more fractures. Risk analysis showed that patients with MMC and thoracic-level paralysis had a sixfold higher risk of fracture compared with those with sacral-level paralysis. Femoral-neck z-scores measured by dual-energy X-ray absorptiometry (DEXA) differed significantly according to the level of neurological impairment, with lower z-scores in children with a higher level of lesion. Furthermore, the rate of epiphyseal separation increased noticeably after cast immobilization, mainly affecting patients who could walk relatively well. CONCLUSIONS: Patients with thoracic-level paralysis represent a group with high fracture risk. According to these results, fractures and epiphyseal injuries in patients with MMC should be treated by plaster immobilization, with the duration of immobilization kept to a minimum (<4 weeks) because of the increased risk of secondary fractures. Alternatively, patients with refractures can be treated surgically when nonoperative treatment has failed.
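The "sixfold higher risk" comparison is a relative risk; the sketch below shows the standard calculation with a Katz log-scale confidence interval, using invented counts rather than the study's data.

```python
# Worked sketch of a relative-risk calculation with a 95% CI (Katz log method).
# The counts below are illustrative only, not the study's data.
import math

a, n1 = 30, 150   # hypothetical: fractures / patients, thoracic-level lesion
b, n2 = 10, 300   # hypothetical: fractures / patients, sacral-level lesion

rr = (a / n1) / (b / n2)
se_log_rr = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)
ci_low = math.exp(math.log(rr) - 1.96 * se_log_rr)
ci_high = math.exp(math.log(rr) + 1.96 * se_log_rr)
print(f"RR = {rr:.1f} (95% CI {ci_low:.1f}-{ci_high:.1f})")
```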
Abstract:
BACKGROUND: Automated reporting of estimated glomerular filtration rate (eGFR) is a recent advance in laboratory information technology (IT) that generates a measure of kidney function with chemistry laboratory results to aid early detection of chronic kidney disease (CKD). Because accurate diagnosis of CKD is critical to optimal medical decision-making, several clinical practice guidelines have recommended the use of automated eGFR reporting. Since its introduction, automated eGFR reporting has not been uniformly implemented by U.S. laboratories despite the growing prevalence of CKD. CKD is highly prevalent within the Veterans Health Administration (VHA), and implementation of automated eGFR reporting within this integrated healthcare system has the potential to improve care. In July 2004, the VHA adopted automated eGFR reporting through a system-wide mandate for software implementation by individual VHA laboratories. This study examines the timing of software implementation by individual VHA laboratories and factors associated with implementation. METHODS: We performed a retrospective observational study of laboratories in VHA facilities from July 2004 to September 2009. Using laboratory data, we identified the status of implementation of automated eGFR reporting for each facility and the time to actual implementation from the date the VHA adopted its policy for automated eGFR reporting. Using survey and administrative data, we assessed facility organizational characteristics associated with implementation of automated eGFR reporting via bivariate analyses. RESULTS: Of 104 VHA laboratories, 88% implemented automated eGFR reporting in existing laboratory IT systems by the end of the study period. Time to initial implementation ranged from 0.2 to 4.0 years with a median of 1.8 years. All VHA facilities with on-site dialysis units implemented the eGFR software (52%, p<0.001). Other organizational characteristics were not statistically significant. CONCLUSIONS: The VHA did not have uniform implementation of automated eGFR reporting across its facilities. Facility-level organizational characteristics were not associated with implementation, and this suggests that decisions for implementation of this software are not related to facility-level quality improvement measures. Additional studies on implementation of laboratory IT, such as automated eGFR reporting, could identify factors that are related to more timely implementation and lead to better healthcare delivery.
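For context, automated eGFR reporting computes an estimate from each serum creatinine result; the sketch below uses the four-variable MDRD Study equation, a common choice for automated reporting in that era, though the abstract does not state which equation the VHA software used.

```python
# Sketch of what an automated eGFR calculation adds to a creatinine report.
# Shown with the IDMS-traceable four-variable MDRD Study equation; the
# abstract does not specify the VHA's equation, so treat this as illustrative.
def egfr_mdrd(serum_creatinine_mg_dl, age_years, female, black):
    """Estimated GFR in mL/min/1.73 m^2 (four-variable MDRD equation)."""
    egfr = 175.0 * serum_creatinine_mg_dl ** -1.154 * age_years ** -0.203
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr

value = egfr_mdrd(1.4, 68, female=False, black=False)
flag = " (eGFR < 60: consistent with CKD if persistent >3 months)" if value < 60 else ""
print(f"eGFR = {value:.0f} mL/min/1.73 m^2{flag}")
```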
Abstract:
Subteratogenic and other low-level chronic exposures to toxicant mixtures are an understudied threat to environmental and human health. It is especially important to understand the effects of these exposures for contaminants such as polycyclic aromatic hydrocarbons (PAHs), a large group of more than 100 individual compounds that are important environmental (including aquatic) contaminants. Aquatic sediments constitute a major sink for hydrophobic pollutants, and studies show PAHs can persist in sediments over time. Furthermore, estuarine systems (notably breeding grounds) are of particular concern, as they are highly impacted by a wide variety of pollutants, and estuarine fishes are often exposed to some of the highest levels of contaminants of any vertebrate taxon. Acute embryonic exposure to PAHs results in cardiac teratogenesis in fish, and early-life exposure to certain individual PAHs and PAH mixtures causes heart alterations with decreased swimming capacity in adult fish. Consequently, the heart and cardiorespiratory system are thought to be targets of PAH mixture exposure. While many studies have investigated acute, teratogenic PAH exposures, few studies have longitudinally examined the impacts of subtle, subteratogenic PAH mixture exposures, which are arguably more broadly applicable to environmental contamination scenarios. The goal of this dissertation was to highlight the later-life consequences of early-life exposure to subteratogenic concentrations of a complex, environmentally relevant PAH mixture.
A unique population of Fundulus heteroclitus (the Atlantic killifish or mummichog, hereafter referred to as killifish) has adapted to creosote-derived polycyclic aromatic hydrocarbons (PAHs) found at the Atlantic Wood Industries (AW) Superfund site in the southern branch of the Elizabeth River, VA, USA. This killifish population survives in a site heavily contaminated with a mixture of PAHs from former creosote operations and has developed resistance to the acute toxicity and teratogenic effects caused by the PAH mixture in sediment from the site. The primary goal of this dissertation was to compare and contrast later-life outcomes of early-life, subteratogenic PAH mixture exposure in both the Atlantic Wood killifish (AW) and a naïve reference population of killifish from King’s Creek (KC; a relatively uncontaminated tributary of the Severn River, VA). Killifish from both populations were exposed to subteratogenic concentrations of a complex PAH-sediment extract, Elizabeth River Sediment Extract (ERSE), made by collecting sediment from the AW site. Fish were reared over a 5-month period in the laboratory, during which they were examined for a variety of molecular, physiological, and behavioral responses.
The central aims of my dissertation were to determine alterations to embryonic gene expression, larval swimming activity, adult behavior, heart structure, enzyme activity, and swimming/cardiorespiratory performance following subteratogenic exposure to ERSE. I hypothesized that subteratogenic exposure to ERSE would impair cardiac ontogenic processes in a way that would be detectable via gene expression in embryos, and that the misregulation of cardiac genes would help to explain activity changes, behavioral deficits, and later-life swimming deficiencies. I also hypothesized that fish heart structure would be altered. In addition, I hypothesized that the AW killifish population would be resistant to developmental exposures and perform normally in later-life challenges. To investigate these hypotheses, a series of experiments was carried out in PAH-adapted killifish from the Elizabeth River and in reference killifish. As an ancillary project to the primary aims of the dissertation, I examined the toxicity of weaker aryl hydrocarbon receptor (AHR) agonists in combination with fluoranthene (FL), an inhibitor of cytochrome P4501A1 (CYP1A1). This side project was conducted in both Danio rerio (zebrafish) and the KC and AW killifish.
Embryonic gene expression was measured in both killifish populations across an ERSE dose-response series at multiple time points (12, 24, 48, and 144 hours post exposure). Expression of genes known to play critical roles in cardiac structure/development, cardiac function, and angiogenesis was elevated, indicating cardiac damage and activation of cardiovascular repair mechanisms. These data helped to inform later-life swimming performance and cardiac histology studies. Behavior was assessed during light and dark cycles in larvae of both populations following developmental exposure to ERSE. While KC killifish showed activity differences following exposure, AW killifish showed no significant changes even at concentrations that would cause overt cardiac toxicity in KC killifish. Juvenile behavior experiments demonstrated hyperactivity following ERSE exposure in KC killifish, but no significant behavioral changes in AW killifish. Adult swimming performance, measured as prolonged critical swimming capacity (Ucrit), revealed performance costs in the AW killifish. Furthermore, swimming performance declined in KC killifish following exposure to increasing dilutions of ERSE. Lastly, cardiac histology suggested that early-life exposure to ERSE could result in cardiac structural alteration and extravasation of blood into the pericardial cavity.
Responses to AHR agonists yielded a ranking of relative potency and determined which agonists, when combined with FL, caused cardiac teratogenesis. These experiments revealed interesting species differences between zebrafish and killifish. To probe the mechanisms responsible for cardiotoxicity, a CYP1A-morpholino and an AHR2-morpholino were used to mimic FL effects or to attempt to rescue cardiac deformities, respectively. The findings suggested that the cardiac toxicity elicited by weak agonist + FL exposure was likely driven by AHR-independent mechanisms. These studies stand in contrast to previous research from our lab showing that moderate AHR agonists combined with FL caused cardiac toxicity that could be partially rescued by AHR-morpholino knockdown.
My findings provide a better characterization of the mechanisms of PAH toxicity and advance our understanding of how subteratogenic mixtures of PAHs exert their toxic action in naïve killifish. Furthermore, these studies provide a framework for investigating how subteratogenic exposures to PAH mixtures can impact aquatic organismal health and performance. Most importantly, these experiments have the potential to help inform risk assessment in fish, mammals, and potentially humans. Ultimately, this research will help protect populations exposed to subtle PAH contamination.
Abstract:
BACKGROUND: Risk assessment with a thorough family health history is recommended by numerous organizations and is now a required component of the annual physical for Medicare beneficiaries under the Affordable Care Act. However, there are several barriers to incorporating robust risk assessments into routine care. MeTree, a web-based patient-facing health risk assessment tool, was developed with the aim of overcoming these barriers. In order to better understand what factors will be instrumental for broader adoption of risk assessment programs like MeTree in clinical settings, we obtained funding to perform a type III hybrid implementation-effectiveness study in primary care clinics at five diverse healthcare systems. Here, we describe the study's protocol. METHODS/DESIGN: MeTree collects personal medical information and a three-generation family health history from patients on 98 conditions. Using algorithms built entirely from current clinical guidelines, it provides clinical decision support to providers and patients on 30 conditions. All adult patients with an upcoming well-visit appointment at one of the 20 intervention clinics are eligible to participate. Patient-oriented risk reports are provided in real time. Provider-oriented risk reports are uploaded to the electronic medical record for review at the time of the appointment. Implementation outcomes are the enrollment rates of clinics, providers, and patients (enrolled vs approached) and their representativeness compared to the underlying population. Primary effectiveness outcomes are the percentage of participants newly identified as being at increased risk for one of the clinical decision support conditions and the percentage with appropriate risk-based screening. Secondary outcomes include the percentage change in those meeting goals for a healthy lifestyle (diet, exercise, and smoking). Outcomes are measured through electronic medical record data abstraction, patient surveys, and surveys/qualitative interviews of clinical staff. DISCUSSION: This study evaluates factors that are critical to successful implementation of a web-based risk assessment tool into routine clinical care in a variety of healthcare settings. The results will identify resource needs and potential barriers and solutions to implementation in each setting, as well as an understanding of potential effectiveness. TRIAL REGISTRATION: NCT01956773.
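As a purely hypothetical illustration of the rule-based pattern described ("algorithms built entirely from current clinical guidelines"), the sketch below flags increased colorectal cancer risk from a family history; it is not MeTree's actual algorithm, and the threshold rule is only an example drawn from common screening guidance.

```python
# Hypothetical example of a guideline-style decision-support rule applied to a
# three-generation family history; NOT MeTree's actual algorithm.
from dataclasses import dataclass

@dataclass
class Relative:
    relation: str          # e.g., "mother", "sister", "maternal grandmother"
    condition: str         # e.g., "colorectal cancer"
    age_at_diagnosis: int

FIRST_DEGREE = {"mother", "father", "sister", "brother", "daughter", "son"}

def colorectal_screening_flag(family_history):
    """Flag increased risk if any first-degree relative had CRC before age 60,
    or two or more first-degree relatives had CRC at any age (example rule)."""
    crc = [r for r in family_history
           if r.condition == "colorectal cancer" and r.relation in FIRST_DEGREE]
    early = any(r.age_at_diagnosis < 60 for r in crc)
    return early or len(crc) >= 2

history = [Relative("father", "colorectal cancer", 54),
           Relative("maternal grandmother", "colorectal cancer", 78)]
if colorectal_screening_flag(history):
    print("Increased risk: recommend earlier/more frequent risk-based screening.")
```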