14 results for United Confederate Veterans. Virginia Division. John Bowie Strange Camp.
at Duke University
Abstract:
BACKGROUND: Primary care, an essential determinant of health system equity, efficiency, and effectiveness, is threatened by inadequate supply and distribution of the provider workforce. The Veterans Health Administration (VHA) has been a frontrunner in the use of nurse practitioners (NPs) and physician assistants (PAs). Evaluation of the roles and impact of NPs and PAs in the VHA is critical to ensuring optimal care for veterans and may inform best practices for use of PAs and NPs in other settings around the world. The purpose of this study was to characterize the use of NPs and PAs in VHA primary care and to examine whether their patients and patient care activities were, on average, less medically complex than those of physicians. METHODS: This is a retrospective cross-sectional analysis of administrative data from VHA primary care encounters between 2005 and 2010. Patient and patient encounter characteristics were compared across provider types (PA, NP, and physician). RESULTS: NPs and PAs attend about 30% of all VHA primary care encounters. NPs, PAs, and physicians fill similar roles in VHA primary care, but patients of PAs and NPs are slightly less complex than those of physicians, and PAs attend a higher proportion of visits for the purpose of determining eligibility for benefits. CONCLUSIONS: This study demonstrates that a highly successful nationwide primary care system relies on NPs and PAs to provide over one-quarter of primary care visits, and that these visits are similar to those of physicians with regard to patient and encounter characteristics. These findings can inform health workforce solutions to physician shortages in the USA and around the world. Future research should compare the quality and costs associated with various combinations of providers and allocations of patient care work, and should elucidate the approaches that maximize quality and efficiency.
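The encounter-level comparison this study describes can be pictured with a short sketch. The snippet below uses pandas with hypothetical column names (encounter_id, provider_type, comorbidity_score, benefits_eligibility_visit); the VHA administrative schema is not given in the abstract.

```python
# Sketch only: compare patient complexity and visit mix across provider types.
# All file and column names are hypothetical, not the study's actual schema.
import pandas as pd

encounters = pd.read_csv("vha_primary_care_2005_2010.csv")  # hypothetical extract

summary = (
    encounters
    .groupby("provider_type")  # PA, NP, or physician
    .agg(
        n_encounters=("encounter_id", "count"),
        mean_comorbidity=("comorbidity_score", "mean"),
        pct_benefits_eligibility=("benefits_eligibility_visit", "mean"),
    )
)
print(summary)
```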
Abstract:
BACKGROUND: Stroke is one of the most disabling and costly impairments of adulthood in the United States. Stroke patients clearly benefit from intensive inpatient care, but due to the high cost, there is considerable interest in implementing interventions to reduce hospital lengths of stay. Early discharge rehabilitation programs require coordinated, well-organized home-based rehabilitation, yet lack of sufficient information about the home setting impedes successful rehabilitation. This trial examines a multifaceted telerehabilitation (TR) intervention that uses telehealth technology to simultaneously evaluate the home environment, assess the patient's mobility skills, initiate rehabilitative treatment, prescribe exercises tailored for stroke patients, and provide periodic goal-oriented reassessment, feedback, and encouragement. METHODS: We describe an ongoing Phase II, 2-arm, 3-site randomized controlled trial (RCT) that determines primarily the effect of TR on physical function and secondarily the effect on disability, falls-related self-efficacy, and patient satisfaction. Fifty participants with a diagnosis of ischemic or hemorrhagic stroke will be randomly assigned to one of two groups: (a) TR; or (b) Usual Care. The TR intervention uses a combination of three videotaped visits and five telephone calls, an in-home messaging device, and additional telephonic contact as needed over a 3-month study period, to provide a progressive rehabilitative intervention with a treatment goal of safe functional mobility of the individual within an accessible home environment. Dependent variables will be measured at baseline, 3, and 6 months and analyzed with a linear mixed-effects model across all time points. DISCUSSION: For patients recovering from stroke, the use of TR to provide home assessments and follow-up training in prescribed equipment has the potential to effectively supplement existing home health services, assist transition to home and increase efficiency. This may be particularly relevant when patients live in remote locations, as is the case for many veterans. TRIAL REGISTRATION: ClinicalTrials.gov Identifier: NCT00384748.
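As an illustration of the stated analysis plan, a linear mixed-effects model across baseline, 3, and 6 months might be fit as below. Variable names (physical_function, group, time_months, participant_id) are placeholders, not the trial's actual code.

```python
# Minimal sketch of the stated analysis: physical function modeled across
# visits with a random intercept per participant. Illustrative names only.
import pandas as pd
import statsmodels.formula.api as smf

data = pd.read_csv("tr_trial_long.csv")  # one row per participant per visit

model = smf.mixedlm(
    "physical_function ~ group * time_months",  # group: TR vs Usual Care
    data,
    groups=data["participant_id"],
)
result = model.fit()
print(result.summary())
```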
Abstract:
Reactions to stressful negative events have long been studied using approaches based on either the narrative interpretation of the event or the traits of the individual. Here, we integrate these 2 approaches by using individual-differences measures of both the narrative interpretation of the stressful event as central to one's life and the personality characteristic of negative affectivity. We show that they each have independent contributions to stress reactions and that high levels on both produce greater than additive effects. The effects on posttraumatic stress symptoms are substantial for both undergraduates (Study 1, n = 2,296; Study 3, n = 488) and veterans (Study 2, n = 104), with mean levels for participants low on both measures near floor on posttraumatic stress symptoms and those high on both measures scoring at or above diagnostic thresholds. Study 3 included 3 measures of narrative centrality and 3 of negative affectivity to demonstrate that the effects were not limited to a single measure. In Study 4 (n = 987), measures associated with symptoms of posttraumatic stress correlated substantially with either measures of narrative centrality or measures of negative affectivity. The concepts of narrative centrality and negative affectivity and the results are consistent with findings from clinical populations using similar measures and with current approaches to therapy. In broad nonclinical populations, such as those used here, the results suggest that we might be able to substantially increase our ability to account for the severity of stress response by including both concepts.
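The "greater than additive" effect described here corresponds to a positive interaction term in a regression of posttraumatic stress symptoms on the two measures; a minimal sketch with illustrative variable names follows.

```python
# Sketch of a supra-additive (interaction) test; file and variable names
# are invented for illustration, not taken from the studies above.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("study_data.csv")  # hypothetical data file

# Main effects of each measure plus their interaction
fit = smf.ols(
    "pts_symptoms ~ narrative_centrality * negative_affectivity", df
).fit()
print(fit.summary())  # a significant positive interaction = more than additive
```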
Abstract:
Fear conditioning is an established model for investigating posttraumatic stress disorder (PTSD). However, symptom triggers may vaguely resemble the initial traumatic event, differing on a variety of sensory and affective dimensions. We extended the fear-conditioning model to assess generalization of conditioned fear on fear processing neurocircuitry in PTSD. Military veterans (n=67) consisting of PTSD (n=32) and trauma-exposed comparison (n=35) groups underwent functional magnetic resonance imaging during fear conditioning to a low fear-expressing face while a neutral face was explicitly unreinforced. Stimuli that varied along a neutral-to-fearful continuum were presented before conditioning to assess baseline responses, and after conditioning to assess experience-dependent changes in neural activity. Compared with trauma-exposed controls, PTSD patients exhibited greater post-study memory distortion of the fear-conditioned stimulus toward the stimulus expressing the highest fear intensity. PTSD patients exhibited biased neural activation toward high-intensity stimuli in fusiform gyrus (P<0.02), insula (P<0.001), primary visual cortex (P<0.05), locus coeruleus (P<0.04), thalamus (P<0.01), and at the trend level in inferior frontal gyrus (P=0.07). All regions except fusiform were moderated by childhood trauma. Amygdala-calcarine (P=0.01) and amygdala-thalamus (P=0.06) functional connectivity selectively increased in PTSD patients for high-intensity stimuli after conditioning. In contrast, amygdala-ventromedial prefrontal cortex (P=0.04) connectivity selectively increased in trauma-exposed controls compared with PTSD patients for low-intensity stimuli after conditioning, representing safety learning. In summary, fear generalization in PTSD is biased toward stimuli with higher emotional intensity than the original conditioned-fear stimulus. Functional brain differences provide a putative neurobiological model for fear generalization whereby PTSD symptoms are triggered by threat cues that merely resemble the index trauma.
Abstract:
OBJECTIVE: The Veterans Health Administration has developed My HealtheVet (MHV), a Web-based portal that links veterans to their care in the Veterans Affairs (VA) system. The objective of this study was to measure diabetic veterans' access to and use of the Internet, and their interest in using MHV to help manage their diabetes. MATERIALS AND METHODS: Cross-sectional mailed survey of 201 patients with type 2 diabetes and hemoglobin A1c > 8.0% receiving primary care at any of five primary care clinic sites affiliated with a VA tertiary care facility. Main measures included Internet usage, access, and attitudes; computer skills; interest in using the Internet; awareness of and attitudes toward MHV; demographics; and socioeconomic status. RESULTS: A majority of respondents reported having access to the Internet at home. Nearly half of all respondents had searched online for information about diabetes, including some who did not have home Internet access. More than a third obtained "some" or "a lot" of their health-related information online. Forty-one percent reported being "very interested" in using MHV to help track their home blood glucose readings, a third of whom did not have home Internet access. Factors associated with being "very interested" were as follows: having access to the Internet at home (p < 0.001), "a lot/some" trust in the Internet as a source of health information (p = 0.002), lower age (p = 0.03), and some college (p = 0.04). Neither race (p = 0.44) nor income (p = 0.25) was significantly associated with interest in MHV. CONCLUSIONS: This study found that a diverse sample of older VA patients with sub-optimally controlled diabetes had a level of familiarity with and access to the Internet comparable to an age-matched national sample. In addition, there was a high degree of interest in using the Internet to help manage their diabetes.
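The factor associations reported above appear to be bivariate, but the same question is often asked with a logistic regression; a hypothetical sketch (variable names invented, not the paper's model):

```python
# Sketch: logistic regression of interest in MHV on the listed factors.
# Names are placeholders; the abstract reports bivariate tests, not this model.
import pandas as pd
import statsmodels.formula.api as smf

survey = pd.read_csv("mhv_survey.csv")  # hypothetical survey extract

fit = smf.logit(
    "very_interested ~ home_internet + trust_internet + age + some_college",
    survey,
).fit()
print(fit.summary())
```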
Abstract:
The Veterans Health Administration (VHA) in the Department of Veterans Affairs (VA) has emerged as a national and international leader in the delivery and research of telehealth-based treatment. Several unique characteristics of care in VA settings intersect to create an ideal environment for telehealth modalities and research. However, the value of telehealth experience and initiatives in VA settings is limited if telehealth strategies cannot be widely exported to other public or private systems. Whereas a hierarchical organization, such as VA, can innovate and fund change relatively quickly based on provider and patient preferences and a growing knowledge base, other health provider organizations and third-party payers will likely require replicable scientific findings over time before incremental investments will be made to create infrastructure, reform regulatory barriers, and amend laws to accommodate expansion of telehealth modalities. Accordingly, large-scale scientifically rigorous telehealth research in VHA settings is essential not only to investigate the efficacy of existing and future telehealth practices in VHA, but also to hasten the development of telehealth infrastructure in private and other public health settings. We propose an expanded partnership between the VA, NIH, and other funding agencies to investigate creative and pragmatic uses of telehealth technology. To this end, we identify six specific areas of research we believe to be particularly relevant to the efficient development of telehealth modalities in civilian and military contexts outside VHA.
Abstract:
BACKGROUND: West Virginia has the worst oral health in the United States, but the reasons for this are unclear. This pilot study explored the etiology of this disparity using culture-independent analyses to identify bacterial species associated with oral disease. METHODS: Bacteria in subgingival plaque samples from twelve participants in two independent West Virginia dental-related studies were characterized using 16S rRNA gene sequencing and Human Oral Microbe Identification Microarray (HOMIM) analysis. UniFrac analysis was used to characterize phylogenetic differences between bacterial communities obtained from plaque of participants with low or high oral disease, which was further evaluated using clustering and Principal Coordinates Analysis (PCoA). RESULTS: Statistically different bacterial signatures (P<0.001) were identified in subgingival plaque of individuals with low or high oral disease in West Virginia based on 16S rRNA gene sequencing. Low-disease plaque contained a high frequency of Veillonella and Streptococcus, with a moderate number of Capnocytophaga. High-disease plaque exhibited substantially increased bacterial diversity and included a large proportion of Clostridiales cluster bacteria (Selenomonas, Eubacterium, Dialister). Phylogenetic trees constructed using 16S rRNA gene sequencing revealed that Clostridiales were repeated colonizers in plaque associated with high oral disease, providing evidence that the oral environment is somehow influencing the bacterial signature linked to disease. CONCLUSIONS: Culture-independent analyses identified an atypical bacterial signature associated with high oral disease in West Virginians and provided evidence that the oral environment influenced this signature. Both findings provide insight into the etiology of the oral disparity in West Virginia.
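A minimal sketch of the UniFrac-plus-PCoA workflow named above, using scikit-bio with a toy tree and toy counts; the study's actual pipeline and data are not reproduced here, and in scikit-bio 0.6+ the otu_ids keyword is named taxa.

```python
# Sketch of UniFrac distances + principal coordinates on toy data.
from io import StringIO

from skbio import TreeNode
from skbio.diversity import beta_diversity
from skbio.stats.ordination import pcoa

# Toy rooted phylogeny over three taxa, with branch lengths
tree = TreeNode.read(StringIO(
    "((Veillonella:0.2,Streptococcus:0.2):0.3,Selenomonas:0.5)root;"))

counts = [[12, 30, 0],   # per-sample counts for each taxon (toy numbers)
          [10, 25, 1],
          [2, 3, 40]]
sample_ids = ["low_1", "low_2", "high_1"]
taxa = ["Veillonella", "Streptococcus", "Selenomonas"]

dm = beta_diversity("unweighted_unifrac", counts, ids=sample_ids,
                    otu_ids=taxa, tree=tree)  # kwarg renamed `taxa` in 0.6+
ordination = pcoa(dm)                         # principal coordinates
print(ordination.proportion_explained)
```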
Abstract:
Selenium (Se) is a micronutrient necessary for the function of a variety of important enzymes, but the margin between essential and toxic concentrations is narrow. Oviparous vertebrates such as birds and fish are especially sensitive to Se toxicity, which causes reproductive impairment and defects in embryo development. Selenium occurs naturally in the Earth's crust, but it can be mobilized by a variety of anthropogenic activities, including agricultural practices, coal burning, and mining.
Mountaintop removal/valley fill (MTR/VF) coal mining is a form of surface mining found throughout central Appalachia in the United States that involves blasting off the tops of mountains to access underlying coal seams. Spoil rock from the mountain is placed into adjacent valleys, forming valley fills, which bury stream headwaters and negatively impact surface water quality. This research focused on the biological impacts of Se leached from MTR/VF coal mining operations located around the Mud River, West Virginia.
In order to assess the status of Se in a lotic (flowing) system such as the Mud River, surface water, insects, and fish samples including creek chub (Semotilus atromaculatus) and green sunfish (Lepomis cyanellus) were collected from a mining-impacted site as well as from a reference site not impacted by mining. Analysis of samples from the mined site showed increased conductivity and Se in the surface waters compared to the reference site, as well as increased concentrations of Se in insects and fish. Histological analysis of mined site fish gills showed a lack of normal parasites, suggesting parasite populations may be disrupted due to poor water quality. X-ray absorption near-edge spectroscopy techniques were used to determine the speciation of Se in insect and creek chub samples. Insects contained approximately 40-50% inorganic Se (selenate and selenite) and 50-60% organic Se (Se-methionine and Se-cystine), while fish tissues contained lower proportions of inorganic Se than insects, instead having higher proportions of organic Se in the forms of methyl-Se-cysteine, Se-cystine, and Se-methionine.
Otoliths, calcified inner ear structures, were also collected from Mud River creek chubs and green sunfish and analyzed for Se content using laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS). Significant differences were found between the two species of fish, based on the concentrations of otolith Se. Green sunfish otoliths from all sites contained background or low concentrations of otolith Se (< 1 µg/g) that were not significantly different between mined and unmined sites. In contrast, creek chub otoliths from the historically mined site contained much higher (≥ 5 µg/g, up to approximately 68 µg/g) concentrations of Se than for the same species in the unmined site or for the green sunfish. Otolith Se concentrations were related to muscle Se concentrations for creek chubs (R² = 0.54, p = 0.0002 for the last 20% of the otolith Se versus muscle Se), while no relationship was observed for green sunfish.
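The otolith-muscle relationship reported for creek chub is a simple least-squares fit; a sketch with invented concentrations follows (the dissertation's data are not reproduced here).

```python
# Sketch: regression of muscle Se on otolith Se for creek chub.
# Concentrations below are invented purely for illustration.
import numpy as np
from scipy import stats

otolith_se = np.array([5.2, 12.1, 30.4, 44.8, 68.0])  # µg/g, last 20% of otolith
muscle_se = np.array([6.0, 3.1, 12.5, 9.8, 18.2])     # µg/g

res = stats.linregress(otolith_se, muscle_se)
print(f"R^2 = {res.rvalue**2:.2f}, p = {res.pvalue:.4f}")
```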
Additional experiments using biofilms grown in the Mud River showed increased Se in mined site biofilms compared to the reference site. When we fed fathead minnows (Pimephales promelas) on these biofilms in the laboratory, they accumulated higher concentrations of Se in liver and ovary tissues compared to fathead minnows fed on reference site biofilms. No differences in Se accumulation were found in muscle from either treatment group. Biofilms were also centrifuged and separated into filamentous green algae and the remaining diatom fraction. The majority of Se was found in the diatom fraction, with only about one-third of the total biofilm Se present in the filamentous green algae fraction.
Finally, zebrafish (Danio rerio) embryos were exposed to aqueous Se in the form of selenate, selenite, and L-selenomethionine in an attempt to determine if oxidative stress plays a role in selenium embryo toxicity. Selenate and selenite exposure did not induce embryo deformities (lordosis and craniofacial malformation). L-selenomethionine, however, induced significantly higher deformity rates at 100 µg/L compared to controls. Antioxidant rescue of L-selenomethionine-induced deformities was attempted in embryos using N-acetylcysteine (NAC). Pretreatment with NAC significantly reduced deformities in the zebrafish embryos secondarily treated with L-selenomethionine, suggesting that oxidative stress may play a role in Se toxicity. Selenite exposure also induced a 6.6-fold increase in glutathione-S-transferase pi class 2 gene expression, which is involved in xenobiotic transformation. No changes in gene expression were observed for selenate or L-selenomethionine-exposed embryos.
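A deformity-rate comparison like the one described is commonly tested on a 2x2 table of counts; the numbers below are invented purely for illustration.

```python
# Sketch: exact test on deformity counts, treated vs control.
# Counts are hypothetical; the study's raw counts are not given above.
from scipy.stats import fisher_exact

# rows: deformed / normal; columns: 100 µg/L L-selenomethionine vs control
table = [[18, 4],
         [42, 56]]
odds_ratio, p_value = fisher_exact(table)
print(f"OR = {odds_ratio:.2f}, p = {p_value:.4f}")
```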
The findings in this dissertation contribute to the understanding of how Se bioaccumulates in a lotic system and is transferred through a simulated foodweb in addition to further exploring oxidative stress as a potential mechanism for Se-induced embryo toxicity. Future studies should continue to pursue the role of oxidative stress and other mechanisms in Se toxicity and the biotransformation of Se in aquatic ecosystems.
Abstract:
Fifty veterans diagnosed with posttraumatic stress disorder (PTSD) each recalled four autobiographical memories: one from the 2 years before service, one non-combat memory from the time in service, one from combat, and one from service that had often come as an intrusive memory. For each memory, they provided 21 ratings about reliving, belief, sensory properties, reexperiencing emotions, visceral emotional responses, fragmentation, and narrative coherence. We used these ratings to examine three claims about traumatic memories: a separation of cognitive and visceral aspects of emotion, an increased sense of reliving, and increased fragmentation. There was evidence for a partial separation of cognitive judgments of reexperiencing an emotion and reports of visceral symptoms of the emotion, with visceral symptoms correlating more consistently with scores on PTSD tests. Reliving, but not fragmentation of the memories, increased with increases in the trauma relatedness of the event and with increases in scores on standardized tests of PTSD severity. Copyright © 2004 John Wiley & Sons, Ltd.
Abstract:
Our percept of visual stability across saccadic eye movements may be mediated by presaccadic remapping. Just before a saccade, neurons that remap become visually responsive at a future field (FF), which anticipates the saccade vector. Hence, the neurons use corollary discharge of saccades. Many of the neurons also decrease their response at the receptive field (RF). Presaccadic remapping occurs in several brain areas including the frontal eye field (FEF), which receives corollary discharge of saccades in its layer IV from a collicular-thalamic pathway. We studied, at two levels, the microcircuitry of remapping in the FEF. At the laminar level, we compared remapping between layers IV and V. At the cellular level, we compared remapping between different neuron types of layer IV. In the FEF in four monkeys (Macaca mulatta), we identified 27 layer IV neurons with orthodromic stimulation and 57 layer V neurons with antidromic stimulation from the superior colliculus. With the use of established criteria, we classified the layer IV neurons as putative excitatory (n = 11), putative inhibitory (n = 12), or ambiguous (n = 4). We found that just before a saccade, putative excitatory neurons increased their visual response at the RF, putative inhibitory neurons showed no change, and ambiguous neurons increased their visual response at the FF. None of the neurons showed presaccadic visual changes at both RF and FF. In contrast, neurons in layer V showed full remapping (at both the RF and FF). Our data suggest that elemental signals for remapping are distributed across neuron types in early cortical processing and combined in later stages of cortical microcircuitry.
Abstract:
BACKGROUND: Automated reporting of estimated glomerular filtration rate (eGFR) is a recent advance in laboratory information technology (IT) that generates a measure of kidney function with chemistry laboratory results to aid early detection of chronic kidney disease (CKD). Because accurate diagnosis of CKD is critical to optimal medical decision-making, several clinical practice guidelines have recommended the use of automated eGFR reporting. Since its introduction, automated eGFR reporting has not been uniformly implemented by U.S. laboratories despite the growing prevalence of CKD. CKD is highly prevalent within the Veterans Health Administration (VHA), and implementation of automated eGFR reporting within this integrated healthcare system has the potential to improve care. In July 2004, the VHA adopted automated eGFR reporting through a system-wide mandate for software implementation by individual VHA laboratories. This study examines the timing of software implementation by individual VHA laboratories and factors associated with implementation. METHODS: We performed a retrospective observational study of laboratories in VHA facilities from July 2004 to September 2009. Using laboratory data, we identified the status of implementation of automated eGFR reporting for each facility and the time to actual implementation from the date the VHA adopted its policy for automated eGFR reporting. Using survey and administrative data, we assessed facility organizational characteristics associated with implementation of automated eGFR reporting via bivariate analyses. RESULTS: Of 104 VHA laboratories, 88% implemented automated eGFR reporting in existing laboratory IT systems by the end of the study period. Time to initial implementation ranged from 0.2 to 4.0 years with a median of 1.8 years. All VHA facilities with on-site dialysis units implemented the eGFR software (52%, p<0.001). Other organizational characteristics were not statistically significant. CONCLUSIONS: The VHA did not have uniform implementation of automated eGFR reporting across its facilities. Facility-level organizational characteristics were not associated with implementation, and this suggests that decisions for implementation of this software are not related to facility-level quality improvement measures. Additional studies on implementation of laboratory IT, such as automated eGFR reporting, could identify factors that are related to more timely implementation and lead to better healthcare delivery.
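Automated eGFR reporting computes the estimate from routine chemistry results; a common choice in that era was the 4-variable MDRD study equation, sketched below. The abstract does not state which equation the VHA software used.

```python
# Sketch of the 4-variable MDRD study equation, one common basis for
# automated eGFR reporting (not necessarily the VHA's implementation).
def egfr_mdrd(scr_mg_dl: float, age_years: float,
              female: bool, black: bool) -> float:
    """eGFR in mL/min/1.73 m^2. Coefficient 186 for conventional creatinine
    assays; IDMS-calibrated assays use 175 instead."""
    egfr = 186.0 * scr_mg_dl ** -1.154 * age_years ** -0.203
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr

# Example: 68-year-old non-Black male with serum creatinine 1.4 mg/dL
print(round(egfr_mdrd(1.4, 68, female=False, black=False), 1))
```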
Abstract:
Emerging evidence suggests that microRNAs can initiate asymmetric division, but whether microRNA and protein cell fate determinants coordinate with each other remains unclear. Here, we show that miR-34a directly suppresses Numb in early-stage colon cancer stem cells (CCSCs), forming an incoherent feedforward loop (IFFL) targeting Notch to separate stem and non-stem cell fates robustly. Perturbation of the IFFL leads to a new intermediate cell population with plastic and ambiguous identity. Lgr5+ mouse intestinal/colon stem cells (ISCs) predominantly undergo symmetric division but turn on asymmetric division to curb the number of ISCs when proinflammatory response causes excessive proliferation. Deletion of miR-34a inhibits asymmetric division and exacerbates Lgr5+ ISC proliferation under such stress. Collectively, our data indicate that microRNA and protein cell fate determinants coordinate to enhance robustness of cell fate decision, and they provide a safeguard mechanism against stem cell proliferation induced by inflammation or oncogenic mutation.
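The incoherent feedforward loop described here (miR-34a represses Notch directly while also repressing Numb, itself a Notch inhibitor) can be caricatured with a toy ODE model; the forms and parameters below are arbitrary and only illustrate the wiring, not the paper's quantitative model.

```python
# Toy IFFL: miR-34a represses Numb; miR-34a and Numb both repress Notch.
# Arbitrary repression terms and rates, for illustration only.
import numpy as np
from scipy.integrate import odeint

def iffl(state, t, mir34a):
    numb, notch = state
    d_numb = 1.0 / (1.0 + mir34a) - numb                     # direct repression
    d_notch = 1.0 / ((1.0 + mir34a) * (1.0 + numb)) - notch  # combined repression
    return [d_numb, d_notch]

t = np.linspace(0, 20, 200)
for mir in (0.1, 1.0, 10.0):
    traj = odeint(iffl, [0.5, 0.5], t, args=(mir,))
    print(f"miR-34a={mir}: steady-state Notch ~ {traj[-1, 1]:.2f}")
```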
Abstract:
BACKGROUND: The National Comprehensive Cancer Network and the American Society of Clinical Oncology have established guidelines for the treatment and surveillance of colorectal cancer (CRC), respectively. Considering these guidelines, an accurate and efficient method is needed to measure receipt of care. METHODS: The accuracy and completeness of Veterans Health Administration (VA) administrative data were assessed by comparing them with data manually abstracted during the Colorectal Cancer Care Collaborative (C4) quality improvement initiative for 618 patients with stage I-III CRC. RESULTS: The VA administrative data contained gender, marital, and birth information for all patients, but race information was missing for 62.1% of patients. The percent agreement for demographic variables ranged from 98.1% to 100%. The kappa statistic for receipt of treatments ranged from 0.21 to 0.60, and there was 96.9% agreement for the date of surgical resection. The percentages of post-diagnosis surveillance events in C4 that were also captured in VA administrative data were 76.0% for colonoscopy, 84.6% for physician visit, and 26.3% for carcinoembryonic antigen (CEA) test. CONCLUSIONS: VA administrative data are accurate and complete for non-race demographic variables, receipt of CRC treatment, colonoscopy, and physician visits, but alternative data sources may be necessary to capture patient race and receipt of CEA tests.
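Percent agreement and the kappa statistic quoted above are standard agreement measures between two data sources; a minimal sketch with made-up labels:

```python
# Sketch: agreement between chart-abstracted (C4) and administrative (VA)
# records for a binary treatment indicator. Labels are invented.
from sklearn.metrics import cohen_kappa_score

c4 = [1, 1, 0, 1, 0, 1, 1, 0]  # chart abstraction: treatment received?
va = [1, 0, 0, 1, 0, 1, 1, 1]  # administrative data

percent_agreement = sum(a == b for a, b in zip(c4, va)) / len(c4)
kappa = cohen_kappa_score(c4, va)  # agreement corrected for chance
print(f"agreement = {percent_agreement:.1%}, kappa = {kappa:.2f}")
```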
Abstract:
AIM: To evaluate pretreatment hepatitis B virus (HBV) testing, vaccination, and antiviral treatment rates in Veterans Affairs patients receiving anti-CD20 Ab, for quality improvement. METHODS: We performed a retrospective cohort study using a national repository of Veterans Health Administration (VHA) electronic health record data. We identified all patients receiving anti-CD20 Ab treatment (2002-2014). We ascertained patient demographics, laboratory results, HBV vaccination status (from vaccination records), pharmacy data, and vital status. The high-risk period for HBV reactivation is during anti-CD20 Ab treatment and 12 mo of follow-up. Therefore, we analyzed those who were followed to death or for at least 12 mo after completing anti-CD20 Ab. Pretreatment serologic tests were used to categorize chronic HBV (hepatitis B surface antigen positive or HBsAg+), past HBV (HBsAg-, hepatitis B core antibody positive or HBcAb+), resolved HBV (HBsAg-, HBcAb+, hepatitis B surface antibody positive or HBsAb+), likely prior vaccination (isolated HBsAb+), HBV negative (HBsAg-, HBcAb-), or unknown. Acute hepatitis B was defined by the appearance of HBsAg+ in the high-risk period in patients who were pretreatment HBV negative. We assessed HBV antiviral treatment and the incidence of hepatitis, liver failure, and death during the high-risk period. Cumulative hepatitis, liver failure, and death after anti-CD20 Ab initiation were compared across HBV disease categories, and differences were tested using the χ² test. Mean times to hepatitis (peak alanine aminotransferase), liver failure, and death relative to anti-CD20 Ab administration and follow-up were also compared by HBV disease group. RESULTS: Among 19304 VHA patients who received anti-CD20 Ab, 10224 (53%) had pretreatment HBsAg testing during the study period, with 49% and 43% tested for HBsAg and HBcAb, respectively, within 6 mo pretreatment in 2014. Of those tested, 2% (167/10224) had chronic HBV, 4% (326/7903) past HBV, 5% (427/8110) resolved HBV, 8% (628/8110) likely prior HBV vaccination, and 76% (6022/7903) were HBV negative. In those with chronic HBV infection, ≤ 37% received HBV antiviral treatment during the high-risk period, while 21% to 23% of those with past or resolved HBV, respectively, received HBV antiviral treatment. During and 12 mo after anti-CD20 Ab, the rate of hepatitis was significantly greater in those HBV positive vs negative (P = 0.001). The mortality rate was 35%-40% in chronic or past hepatitis B and 26%-31% in hepatitis B negative. Among those who were pretreatment HBV negative, 16 of 4947 tested (0.3%) developed acute hepatitis B during anti-CD20 Ab treatment and follow-up. CONCLUSION: While HBV testing of Veterans has increased prior to anti-CD20 Ab, few HBV+ patients received HBV antivirals, suggesting electronic health record algorithms may enhance health outcomes.
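The pretreatment serologic categories listed above amount to a simple classification rule; a sketch of that logic follows (None means the test was not performed; the branch ordering beyond what the abstract states is an assumption).

```python
# Sketch of the serologic classification described in the abstract.
def classify_hbv(hbsag, hbcab, hbsab):
    """Each argument: True (positive), False (negative), or None (untested)."""
    if hbsag is True:
        return "chronic HBV"
    if hbsag is False and hbcab is True and hbsab is True:
        return "resolved HBV"
    if hbsag is False and hbcab is True:
        return "past HBV"
    if hbsab is True and hbsag is False and hbcab is False:
        return "likely prior vaccination"  # isolated HBsAb+
    if hbsag is False and hbcab is False:
        return "HBV negative"
    return "unknown"  # incomplete testing

print(classify_hbv(False, True, True))  # -> resolved HBV
```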