35 results for United Confederate Veterans


Relevance:

100.00%

Publisher:

Abstract:

BACKGROUND: Primary care, an essential determinant of health system equity, efficiency, and effectiveness, is threatened by inadequate supply and distribution of the provider workforce. The Veterans Health Administration (VHA) has been a frontrunner in the use of nurse practitioners (NPs) and physician assistants (PAs). Evaluation of the roles and impact of NPs and PAs in the VHA is critical to ensuring optimal care for veterans and may inform best practices for use of PAs and NPs in other settings around the world. The purpose of this study was to characterize the use of NPs and PAs in VHA primary care and to examine whether their patients and patient care activities were, on average, less medically complex than those of physicians. METHODS: This is a retrospective cross-sectional analysis of administrative data from VHA primary care encounters between 2005 and 2010. Patient and patient encounter characteristics were compared across provider types (PA, NP, and physician). RESULTS: NPs and PAs attend about 30% of all VHA primary care encounters. NPs, PAs, and physicians fill similar roles in VHA primary care, but patients of PAs and NPs are slightly less complex than those of physicians, and PAs attend a higher proportion of visits for the purpose of determining eligibility for benefits. CONCLUSIONS: This study demonstrates that a highly successful nationwide primary care system relies on NPs and PAs to provide over one quarter of primary care visits, and that these visits are similar to those of physicians with regard to patient and encounter characteristics. These findings can inform health workforce solutions to physician shortages in the USA and around the world. Future research should compare the quality and costs associated with various combinations of providers and allocations of patient care work, and should elucidate the approaches that maximize quality and efficiency.

Relevance:

90.00%

Publisher:

Abstract:

BACKGROUND: Stroke is one of the most disabling and costly impairments of adulthood in the United States. Stroke patients clearly benefit from intensive inpatient care, but due to the high cost, there is considerable interest in implementing interventions to reduce hospital lengths of stay. Early discharge rehabilitation programs require coordinated, well-organized home-based rehabilitation, yet lack of sufficient information about the home setting impedes successful rehabilitation. This trial examines a multifaceted telerehabilitation (TR) intervention that uses telehealth technology to simultaneously evaluate the home environment, assess the patient's mobility skills, initiate rehabilitative treatment, prescribe exercises tailored for stroke patients, and provide periodic goal-oriented reassessment, feedback, and encouragement. METHODS: We describe an ongoing Phase II, 2-arm, 3-site randomized controlled trial (RCT) that primarily assesses the effect of TR on physical function and secondarily its effect on disability, falls-related self-efficacy, and patient satisfaction. Fifty participants with a diagnosis of ischemic or hemorrhagic stroke will be randomly assigned to one of two groups: (a) TR; or (b) Usual Care. The TR intervention uses a combination of three videotaped visits and five telephone calls, an in-home messaging device, and additional telephonic contact as needed over a 3-month study period, to provide a progressive rehabilitative intervention with a treatment goal of safe functional mobility of the individual within an accessible home environment. Dependent variables will be measured at baseline, 3, and 6 months and analyzed with a linear mixed-effects model across all time points. DISCUSSION: For patients recovering from stroke, the use of TR to provide home assessments and follow-up training in prescribed equipment has the potential to effectively supplement existing home health services, assist the transition to home, and increase efficiency. This may be particularly relevant when patients live in remote locations, as is the case for many veterans. TRIAL REGISTRATION: ClinicalTrials.gov Identifier: NCT00384748.
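
As a rough illustration of the planned analysis (not the trial's actual code), the sketch below fits a linear mixed-effects model to repeated physical-function measurements at baseline, 3, and 6 months using statsmodels; the file and column names are hypothetical.

```python
# Hypothetical sketch of the analysis plan described above: a linear
# mixed-effects model of physical function measured at baseline, 3, and
# 6 months, with a random intercept per participant. Column names are
# illustrative, not taken from the trial.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("tr_outcomes.csv")  # one row per participant per visit

# group: TR vs Usual Care; month: 0, 3, 6; physical_function: outcome score
model = smf.mixedlm(
    "physical_function ~ C(group) * C(month)",  # fixed effects and interaction
    data=df,
    groups=df["participant_id"],                # random intercept per subject
)
result = model.fit()
print(result.summary())
```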

Relevance:

80.00%

Publisher:

Abstract:

Reactions to stressful negative events have long been studied using approaches based on either the narrative interpretation of the event or the traits of the individual. Here, we integrate these 2 approaches by using individual-differences measures of both the narrative interpretation of the stressful event as central to one's life and the personality characteristic of negative affectivity. We show that they each have independent contributions to stress reactions and that high levels on both produce greater than additive effects. The effects on posttraumatic stress symptoms are substantial for both undergraduates (Study 1, n = 2,296; Study 3, n = 488) and veterans (Study 2, n = 104), with mean levels for participants low on both measures near floor on posttraumatic stress symptoms and those high on both measures scoring at or above diagnostic thresholds. Study 3 included 3 measures of narrative centrality and 3 of negative affectivity to demonstrate that the effects were not limited to a single measure. In Study 4 (n = 987), measures associated with symptoms of posttraumatic stress correlated substantially with either measures of narrative centrality or measures of negative affectivity. The concepts of narrative centrality and negative affectivity and the results are consistent with findings from clinical populations using similar measures and with current approaches to therapy. In broad nonclinical populations, such as those used here, the results suggest that we might be able to substantially increase our ability to account for the severity of stress response by including both concepts.
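
The "greater than additive" pattern described above corresponds to a statistical interaction between the two individual-difference measures. Below is a minimal sketch of how such an interaction could be tested, with hypothetical variable names and assuming a simple OLS regression rather than the authors' exact analysis.

```python
# Illustrative sketch (not the authors' code): testing whether narrative
# centrality and negative affectivity interact in predicting PTSD symptom
# severity. "Greater than additive" corresponds to a positive interaction
# term. Variable and file names are hypothetical.
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("survey_data.csv")

# Mean-center the predictors so main effects are interpretable
df["centrality_c"] = df["centrality"] - df["centrality"].mean()
df["neg_affect_c"] = df["neg_affect"] - df["neg_affect"].mean()

# Main effects plus interaction term
fit = smf.ols("pts_symptoms ~ centrality_c * neg_affect_c", data=df).fit()
print(fit.params)    # a positive interaction coefficient implies a synergistic effect
print(fit.pvalues)
```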

Relevance:

80.00%

Publisher:

Abstract:

Fear conditioning is an established model for investigating posttraumatic stress disorder (PTSD). However, symptom triggers may vaguely resemble the initial traumatic event, differing on a variety of sensory and affective dimensions. We extended the fear-conditioning model to assess generalization of conditioned fear on fear processing neurocircuitry in PTSD. Military veterans (n=67) consisting of PTSD (n=32) and trauma-exposed comparison (n=35) groups underwent functional magnetic resonance imaging during fear conditioning to a low fear-expressing face while a neutral face was explicitly unreinforced. Stimuli that varied along a neutral-to-fearful continuum were presented before conditioning to assess baseline responses, and after conditioning to assess experience-dependent changes in neural activity. Compared with trauma-exposed controls, PTSD patients exhibited greater post-study memory distortion of the fear-conditioned stimulus toward the stimulus expressing the highest fear intensity. PTSD patients exhibited biased neural activation toward high-intensity stimuli in fusiform gyrus (P<0.02), insula (P<0.001), primary visual cortex (P<0.05), locus coeruleus (P<0.04), thalamus (P<0.01), and at the trend level in inferior frontal gyrus (P=0.07). All regions except fusiform were moderated by childhood trauma. Amygdala-calcarine (P=0.01) and amygdala-thalamus (P=0.06) functional connectivity selectively increased in PTSD patients for high-intensity stimuli after conditioning. In contrast, amygdala-ventromedial prefrontal cortex (P=0.04) connectivity selectively increased in trauma-exposed controls compared with PTSD patients for low-intensity stimuli after conditioning, representing safety learning. In summary, fear generalization in PTSD is biased toward stimuli with higher emotional intensity than the original conditioned-fear stimulus. Functional brain differences provide a putative neurobiological model for fear generalization whereby PTSD symptoms are triggered by threat cues that merely resemble the index trauma.

Relevance:

30.00%

Publisher:

Abstract:

OBJECTIVE: The Veterans Health Administration has developed My HealtheVet (MHV), a Web-based portal that links veterans to their care in the Veterans Affairs (VA) system. The objective of this study was to measure diabetic veterans' access to and use of the Internet, and their interest in using MHV to help manage their diabetes. MATERIALS AND METHODS: Cross-sectional mailed survey of 201 patients with type 2 diabetes and hemoglobin A1c > 8.0% receiving primary care at any of five primary care clinic sites affiliated with a VA tertiary care facility. Main measures included Internet usage, access, and attitudes; computer skills; interest in using the Internet; awareness of and attitudes toward MHV; demographics; and socioeconomic status. RESULTS: A majority of respondents reported having access to the Internet at home. Nearly half of all respondents had searched online for information about diabetes, including some who did not have home Internet access. More than a third obtained "some" or "a lot" of their health-related information online. Forty-one percent reported being "very interested" in using MHV to help track their home blood glucose readings, a third of whom did not have home Internet access. Factors associated with being "very interested" were as follows: having access to the Internet at home (p < 0.001), "a lot/some" trust in the Internet as a source of health information (p = 0.002), lower age (p = 0.03), and some college (p = 0.04). Neither race (p = 0.44) nor income (p = 0.25) was significantly associated with interest in MHV. CONCLUSIONS: This study found that a diverse sample of older VA patients with suboptimally controlled diabetes had a level of familiarity with and access to the Internet comparable to that of an age-matched national sample. In addition, there was a high degree of interest in using the Internet to help manage their diabetes.

Relevance:

30.00%

Publisher:

Abstract:

The Veterans Health Administration (VHA) in the Department of Veterans Affairs (VA) has emerged as a national and international leader in the delivery and research of telehealth-based treatment. Several unique characteristics of care in VA settings intersect to create an ideal environment for telehealth modalities and research. However, the value of telehealth experience and initiatives in VA settings is limited if telehealth strategies cannot be widely exported to other public or private systems. Whereas a hierarchical organization such as the VA can innovate and fund change relatively quickly based on provider and patient preferences and a growing knowledge base, other health provider organizations and third-party payers will likely require replicable scientific findings over time before incremental investments are made to create infrastructure, reform regulatory barriers, and amend laws to accommodate expansion of telehealth modalities. Accordingly, large-scale, scientifically rigorous telehealth research in VHA settings is essential not only to investigate the efficacy of existing and future telehealth practices in VHA, but also to hasten the development of telehealth infrastructure in private and other public health settings. We propose an expanded partnership between the VA, NIH, and other funding agencies to investigate creative and pragmatic uses of telehealth technology. To this end, we identify six specific areas of research we believe to be particularly relevant to the efficient development of telehealth modalities in civilian and military contexts outside the VHA.

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: Automated reporting of estimated glomerular filtration rate (eGFR) is a recent advance in laboratory information technology (IT) that generates a measure of kidney function with chemistry laboratory results to aid early detection of chronic kidney disease (CKD). Because accurate diagnosis of CKD is critical to optimal medical decision-making, several clinical practice guidelines have recommended the use of automated eGFR reporting. Since its introduction, automated eGFR reporting has not been uniformly implemented by U.S. laboratories despite the growing prevalence of CKD. CKD is highly prevalent within the Veterans Health Administration (VHA), and implementation of automated eGFR reporting within this integrated healthcare system has the potential to improve care. In July 2004, the VHA adopted automated eGFR reporting through a system-wide mandate for software implementation by individual VHA laboratories. This study examines the timing of software implementation by individual VHA laboratories and factors associated with implementation. METHODS: We performed a retrospective observational study of laboratories in VHA facilities from July 2004 to September 2009. Using laboratory data, we identified the status of implementation of automated eGFR reporting for each facility and the time to actual implementation from the date the VHA adopted its policy for automated eGFR reporting. Using survey and administrative data, we assessed facility organizational characteristics associated with implementation of automated eGFR reporting via bivariate analyses. RESULTS: Of 104 VHA laboratories, 88% implemented automated eGFR reporting in existing laboratory IT systems by the end of the study period. Time to initial implementation ranged from 0.2 to 4.0 years, with a median of 1.8 years. All VHA facilities with on-site dialysis units implemented the eGFR software (52%, p < 0.001). Other organizational characteristics were not statistically significant. CONCLUSIONS: The VHA did not have uniform implementation of automated eGFR reporting across its facilities. Facility-level organizational characteristics were not associated with implementation, which suggests that decisions to implement this software are not related to facility-level quality improvement measures. Additional studies on implementation of laboratory IT, such as automated eGFR reporting, could identify factors that are related to more timely implementation and lead to better healthcare delivery.
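
The abstract does not state which estimating equation the VHA software used; as a hedged illustration, the sketch below implements the re-expressed four-variable MDRD Study equation, which was widely used for automated eGFR reporting during the study period.

```python
# Illustrative only: the abstract does not specify the VHA's equation.
# This implements the re-expressed 4-variable MDRD Study equation, a
# common choice for automated eGFR reporting in that era.
def egfr_mdrd(serum_creatinine_mg_dl: float, age_years: float,
              female: bool, black: bool) -> float:
    """Estimated GFR in mL/min/1.73 m^2 (4-variable MDRD, IDMS-traceable)."""
    egfr = 175.0 * (serum_creatinine_mg_dl ** -1.154) * (age_years ** -0.203)
    if female:
        egfr *= 0.742
    if black:
        egfr *= 1.212
    return egfr

# Example: a 70-year-old white male with serum creatinine 1.4 mg/dL
print(round(egfr_mdrd(1.4, 70, female=False, black=False), 1))  # about 50, CKD stage 3 range
```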

Relevance:

30.00%

Publisher:

Abstract:

BACKGROUND: The National Comprehensive Cancer Network and the American Society of Clinical Oncology have established guidelines for the treatment and surveillance of colorectal cancer (CRC), respectively. Considering these guidelines, an accurate and efficient method is needed to measure receipt of care. METHODS: The accuracy and completeness of Veterans Health Administration (VA) administrative data were assessed by comparing them with data manually abstracted during the Colorectal Cancer Care Collaborative (C4) quality improvement initiative for 618 patients with stage I-III CRC. RESULTS: The VA administrative data contained gender, marital, and birth information for all patients, but race information was missing for 62.1% of patients. The percent agreement for demographic variables ranged from 98.1% to 100%. The kappa statistic for receipt of treatments ranged from 0.21 to 0.60, and there was 96.9% agreement on the date of surgical resection. The percentages of post-diagnosis surveillance events in C4 that were also present in VA administrative data were 76.0% for colonoscopy, 84.6% for physician visits, and 26.3% for carcinoembryonic antigen (CEA) tests. CONCLUSIONS: VA administrative data are accurate and complete for non-race demographic variables, receipt of CRC treatment, colonoscopy, and physician visits, but alternative data sources may be necessary to capture patient race and receipt of CEA tests.
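
For readers unfamiliar with the agreement statistics cited above, the following minimal sketch computes percent agreement and Cohen's kappa for a toy comparison of abstracted versus administrative treatment flags; the data are invented for illustration.

```python
# Minimal sketch (invented toy data, not the study's records): percent
# agreement and Cohen's kappa for a treatment flag recorded both in manual
# C4 abstraction and in VA administrative data.
abstracted = [1, 1, 0, 1, 0, 0, 1, 1]   # manually abstracted receipt of treatment
admin_data = [1, 0, 0, 1, 0, 1, 1, 1]   # same patients per administrative data
n = len(abstracted)

# Observed agreement
p_o = sum(a == b for a, b in zip(abstracted, admin_data)) / n

# Chance agreement from each source's marginal frequencies
p_yes_a = sum(abstracted) / n
p_yes_b = sum(admin_data) / n
p_e = p_yes_a * p_yes_b + (1 - p_yes_a) * (1 - p_yes_b)

kappa = (p_o - p_e) / (1 - p_e)
print(f"Percent agreement: {p_o:.1%}")
print(f"Cohen's kappa:     {kappa:.2f}")
```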

Relevance:

30.00%

Publisher:

Abstract:

AIM: To evaluate pretreatment hepatitis B virus (HBV) testing, vaccination, and antiviral treatment rates in Veterans Affairs patients receiving anti-CD20 Ab, for quality improvement. METHODS: We performed a retrospective cohort study using a national repository of Veterans Health Administration (VHA) electronic health record data. We identified all patients receiving anti-CD20 Ab treatment (2002-2014). We ascertained patient demographics, laboratory results, HBV vaccination status (from vaccination records), pharmacy data, and vital status. The high-risk period for HBV reactivation is during anti-CD20 Ab treatment and the 12 mo of follow-up. Therefore, we analyzed those who were followed to death or for at least 12 mo after completing anti-CD20 Ab. Pretreatment serologic tests were used to categorize chronic HBV (hepatitis B surface antigen positive, or HBsAg+), past HBV (HBsAg-, hepatitis B core antibody positive, or HBcAb+), resolved HBV (HBsAg-, HBcAb+, hepatitis B surface antibody positive, or HBsAb+), likely prior vaccination (isolated HBsAb+), HBV negative (HBsAg-, HBcAb-), or unknown. Acute hepatitis B was defined by the appearance of HBsAg+ in the high-risk period in patients who were HBV negative pretreatment. We assessed HBV antiviral treatment and the incidence of hepatitis, liver failure, and death during the high-risk period. Cumulative hepatitis, liver failure, and death after anti-CD20 Ab initiation were compared by HBV disease category, and differences were compared using the χ² test. Mean times to hepatitis (peak alanine aminotransferase), liver failure, and death relative to anti-CD20 Ab administration and follow-up were also compared by HBV disease group. RESULTS: Among 19304 VHA patients who received anti-CD20 Ab, 10224 (53%) had pretreatment HBsAg testing during the study period, with 49% and 43% tested for HBsAg and HBcAb, respectively, within 6 mo pretreatment in 2014. Of those tested, 2% (167/10224) had chronic HBV, 4% (326/7903) past HBV, 5% (427/8110) resolved HBV, 8% (628/8110) likely prior HBV vaccination, and 76% (6022/7903) were HBV negative. Of those with chronic HBV infection, ≤ 37% received HBV antiviral treatment during the high-risk period, while 21% and 23% of those with past or resolved HBV, respectively, received HBV antiviral treatment. During and 12 mo after anti-CD20 Ab, the rate of hepatitis was significantly greater in those HBV positive vs negative (P = 0.001). The mortality rate was 35%-40% in chronic or past hepatitis B and 26%-31% in those who were hepatitis B negative. Of those who were HBV negative pretreatment, 16 (0.3%) of 4947 tested developed acute hepatitis B during anti-CD20 Ab treatment and follow-up. CONCLUSION: While HBV testing of veterans has increased prior to anti-CD20 Ab, few HBV+ patients received HBV antivirals, suggesting electronic health record algorithms may enhance health outcomes.
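
The serologic categories above follow explicit rules on HBsAg, HBcAb, and HBsAb results. A small sketch of that classification logic is shown below; the handling of untested (missing) markers is an assumption, since the abstract does not specify it.

```python
# Sketch of the pretreatment serologic classification described above.
# Inputs are True/False/None (None = not tested); how missing results were
# handled in the study is assumed here, not stated in the abstract.
from typing import Optional

def classify_hbv(hbsag: Optional[bool], hbcab: Optional[bool],
                 hbsab: Optional[bool]) -> str:
    if hbsag is True:
        return "chronic HBV"                 # HBsAg+
    if hbsag is False and hbcab is True and hbsab is True:
        return "resolved HBV"                # HBsAg-, HBcAb+, HBsAb+
    if hbsag is False and hbcab is True:
        return "past HBV"                    # HBsAg-, HBcAb+
    if hbsag is False and hbcab is False and hbsab is True:
        return "likely prior vaccination"    # isolated HBsAb+
    if hbsag is False and hbcab is False:
        return "HBV negative"                # HBsAg-, HBcAb-
    return "unknown"                         # incomplete testing

print(classify_hbv(False, True, True))   # resolved HBV
print(classify_hbv(False, False, True))  # likely prior vaccination
```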

Relevance:

20.00%

Publisher:

Abstract:

Knowing one's HIV status is particularly important in the setting of recent tuberculosis (TB) exposure. Blood tests for assessment of tuberculosis infection, such as the QuantiFERON Gold In-Tube test (QFT; Cellestis Limited, Carnegie, Victoria, Australia), offer the possibility of simultaneous screening for TB and HIV with a single blood draw. We performed a cross-sectional analysis of all contacts of a highly infectious TB case in a large meatpacking factory. Twenty-two percent were foreign-born and 73% were black. Contacts were tested with both tuberculin skin testing (TST) and QFT. HIV testing was offered on an opt-out basis. Persons with TST ≥ 10 mm, positive QFT, and/or a positive HIV test were offered latent TB treatment. Three hundred twenty-six contacts were screened: TST results were available for 266 people, and an additional 24 reported a prior positive TST, for a total of 290 persons with any TST result (89.0%). Adequate QFT specimens were obtained for 312 persons (95.7%). Thirty-two persons had QFT results but did not return for TST reading. Twenty-two percent met the criteria for latent TB infection. Eighty-eight percent accepted HIV testing. Two (0.7%) were HIV seropositive; both individuals were already aware of their HIV status, but one had stopped care a year previously. None of the HIV-seropositive persons had latent TB, but all were offered latent TB treatment per standard guidelines. This demonstrates that opt-out HIV testing combined with QFT in a large TB contact investigation was feasible and useful. HIV testing was also widely accepted. Pairing QFT with opt-out HIV testing should be strongly considered when possible.

Relevance:

20.00%

Publisher:

Abstract:

The best wind sites in the United States are often located far from electricity demand centers and lack transmission access. Local sites that have lower quality wind resources but do not require as much power transmission capacity are an alternative to distant wind resources. In this paper, we explore the trade-offs between developing new wind generation at local sites and installing wind farms at remote sites. We first examine the general relationship between the high capital costs required for local wind development and the relatively lower capital costs required to install a wind farm capable of generating the same electrical output at a remote site, with the results representing the maximum amount an investor should be willing to pay for transmission access. We suggest that this analysis can be used as a first step in comparing potential wind resources to meet a state renewable portfolio standard (RPS). To illustrate, we compare the cost of local wind (∼50 km from the load) to the cost of distant wind requiring new transmission (∼550-750 km from the load) to meet the Illinois RPS. We find that local, lower capacity factor wind sites are the lowest cost option for meeting the Illinois RPS if new long-distance transmission is required to access distant, higher capacity factor wind resources. If higher capacity wind sites can be connected to the existing grid at minimal cost, in many cases they will have lower costs.
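
A hypothetical worked example of the break-even logic described above: to deliver the same annual energy, a lower-capacity-factor local site needs more installed capacity than a remote site, and the capital-cost difference bounds what an investor should pay for transmission. All numbers below are illustrative assumptions, not values from the paper.

```python
# Hypothetical worked example of the trade-off described above; the numbers
# are illustrative, not the paper's. The capital-cost difference between the
# larger local wind farm and the smaller remote wind farm needed for the
# same annual energy is the most that transmission access is worth.
CAPITAL_COST_PER_MW = 2.0e6   # assumed installed cost, $/MW (same at both sites)
CF_LOCAL = 0.26               # assumed local capacity factor
CF_REMOTE = 0.38              # assumed remote capacity factor
ENERGY_TARGET_MWH = 1.0e6     # annual energy required, MWh/yr
HOURS_PER_YEAR = 8760

mw_local = ENERGY_TARGET_MWH / (CF_LOCAL * HOURS_PER_YEAR)
mw_remote = ENERGY_TARGET_MWH / (CF_REMOTE * HOURS_PER_YEAR)

cost_local = mw_local * CAPITAL_COST_PER_MW
cost_remote_generation = mw_remote * CAPITAL_COST_PER_MW

max_transmission_spend = cost_local - cost_remote_generation
print(f"Local capacity needed:  {mw_local:,.0f} MW")
print(f"Remote capacity needed: {mw_remote:,.0f} MW")
print(f"Maximum worth paying for transmission: ${max_transmission_spend:,.0f}")
```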

Relevance:

20.00%

Publisher:

Abstract:

The variability of summer precipitation in the southeastern United States is examined in this study using 60-yr (1948-2007) rainfall data. Southeast summer rainfall exhibited higher interannual variability, with more intense summer droughts and anomalous wetness, in the recent 30 years (1978-2007) than in the prior 30 years (1948-77). Such intensification of summer rainfall variability was consistent with a decrease of light (0.1-1 mm day⁻¹) and medium (1-10 mm day⁻¹) rainfall events during extremely dry summers and an increase of heavy (>10 mm day⁻¹) rainfall events in extremely wet summers. Changes in rainfall variability were also accompanied by a southward shift of the region of maximum zonal wind variability at the jet stream level in the latter period. The covariability between Southeast summer precipitation and sea surface temperatures (SSTs) is also analyzed using the singular value decomposition (SVD) method. It is shown that the increase of Southeast summer precipitation variability is primarily associated with higher SST variability across the equatorial Atlantic and also SST warming in the Atlantic. © 2010 American Meteorological Society.
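
As a schematic of the SVD (maximum covariance) approach mentioned above, the sketch below decomposes the cross-covariance between a precipitation anomaly field and an SST anomaly field; the array shapes and random stand-in data are assumptions for illustration only.

```python
# Schematic sketch (assumed data shapes, not the authors' code) of the SVD /
# maximum covariance analysis relating summer precipitation anomalies to SST
# anomalies. Rows are years, columns are grid points; fields should already
# be anomalies (seasonal means removed).
import numpy as np

n_years, n_precip_pts, n_sst_pts = 60, 500, 800
rng = np.random.default_rng(0)
precip = rng.standard_normal((n_years, n_precip_pts))   # stand-in for gridded rainfall
sst = rng.standard_normal((n_years, n_sst_pts))         # stand-in for gridded SST

cov = precip.T @ sst / (n_years - 1)      # cross-covariance matrix
u, s, vt = np.linalg.svd(cov, full_matrices=False)

# Leading pair of coupled spatial patterns and the squared covariance fraction
precip_pattern, sst_pattern = u[:, 0], vt[0, :]
scf = s[0] ** 2 / np.sum(s ** 2)
print(f"Leading mode explains {scf:.1%} of squared covariance")

# Expansion coefficients (time series) obtained by projecting onto the patterns
pc_precip = precip @ precip_pattern
pc_sst = sst @ sst_pattern
```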

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND: Clinical practice guidelines recommend colonoscopies at regular intervals for colorectal cancer (CRC) survivors. Using data from a large, multi-regional, population-based cohort, we describe the rate of surveillance colonoscopy and its association with geographic, sociodemographic, clinical, and health services characteristics. METHODS: We studied CRC survivors enrolled in the Cancer Care Outcomes Research and Surveillance (CanCORS) study. Eligible survivors were diagnosed between 2003 and 2005, had curative surgery for CRC, and were alive without recurrence 14 months after surgery with curative intent. Data came from patient interviews and medical record abstraction. We used a multivariate logit model to identify predictors of colonoscopy use. RESULTS: Despite guidelines recommending surveillance, only 49% of the 1423 eligible survivors received a colonoscopy within 14 months after surgery. We observed large differences across regions (38% to 57%). Survivors who received surveillance colonoscopy were more likely to have colon cancer than rectal cancer (OR = 1.41, 95% CI: 1.05-1.90), to have visited a primary care physician (OR = 1.44, 95% CI: 1.14-1.82), and to have received adjuvant chemotherapy (OR = 1.75, 95% CI: 1.27-2.41). Compared to survivors with no comorbidities, survivors with moderate or severe comorbidities were less likely to receive surveillance colonoscopy (OR = 0.69, 95% CI: 0.49-0.98 and OR = 0.44, 95% CI: 0.29-0.66, respectively). CONCLUSIONS: Despite guidelines, more than half of CRC survivors did not receive surveillance colonoscopy within 14 months of surgery, with substantial variation by site of care. The association with primary care visits and adjuvant chemotherapy use suggests that access to care following surgery affects cancer surveillance.
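
Below is a minimal sketch of the kind of multivariate logit model described above, fitting receipt of surveillance colonoscopy on patient and care characteristics and converting coefficients to odds ratios; the file, variable names, and model specification are hypothetical, not taken from CanCORS.

```python
# Illustrative sketch of a multivariate logit model for colonoscopy receipt;
# variable and file names are hypothetical, not from the CanCORS dataset.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("cancors_survivors.csv")

fit = smf.logit(
    "surveillance_colonoscopy ~ C(cancer_site) + saw_primary_care "
    "+ adjuvant_chemo + C(comorbidity_level) + C(region)",
    data=df,
).fit()

odds_ratios = np.exp(fit.params)       # exponentiated coefficients = odds ratios
conf_int = np.exp(fit.conf_int())      # 95% confidence intervals on the OR scale
print(pd.concat([odds_ratios, conf_int], axis=1))
```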

Relevance:

20.00%

Publisher:

Abstract:

BACKGROUND: Many patients with diabetes have poor blood pressure (BP) control. Pharmacological therapy is the cornerstone of effective BP treatment, yet there are high rates both of poor medication adherence and of failure to intensify medications. Successful medication management requires an effective partnership between providers, who initiate and increase doses of effective medications, and patients, who adhere to the regimen. METHODS: In this cluster-randomized controlled effectiveness study, primary care teams within sites were randomized either to usual care or to a program led by a clinical pharmacist trained in motivational interviewing-based behavioral counseling approaches and authorized to make BP medication changes. This study involved the collection of data during a 14-month intervention period in three Department of Veterans Affairs facilities and two Kaiser Permanente Northern California facilities. The clinical pharmacist was supported by clinical information systems that enabled proactive identification of, and outreach to, eligible patients identified on the basis of poor BP control and either medication refill gaps or lack of recent medication intensification. The primary outcome is the relative change in systolic blood pressure (SBP) measurements over time. Secondary outcomes are changes in hemoglobin A1c, low-density lipoprotein (LDL) cholesterol, medication adherence determined from pharmacy refill data, and medication intensification rates. DISCUSSION: Integration of the three intervention elements (proactive identification, adherence counseling, and medication intensification) is essential to achieve optimal levels of control for high-risk patients. Testing the effectiveness of this intervention at the team level allows us to study the program as it would typically be implemented within a clinic setting, including how it integrates with other elements of care. TRIAL REGISTRATION: The ClinicalTrials.gov registration number is NCT00495794.

Relevance:

20.00%

Publisher:

Abstract:

Cryptococcus gattii causes life-threatening disease in otherwise healthy hosts and to a lesser extent in immunocompromised hosts. The highest incidence for this disease is on Vancouver Island, Canada, where an outbreak is expanding into neighboring regions including mainland British Columbia and the United States. This outbreak is caused predominantly by C. gattii molecular type VGII, specifically VGIIa/major. In addition, a novel genotype, VGIIc, has emerged in Oregon and is now a major source of illness in the region. Through molecular epidemiology and population analysis of MLST and VNTR markers, we show that the VGIIc group is clonal and hypothesize it arose recently. The VGIIa/IIc outbreak lineages are sexually fertile and studies support ongoing recombination in the global VGII population. This illustrates two hallmarks of emerging outbreaks: high clonality and the emergence of novel genotypes via recombination. In macrophage and murine infections, the novel VGIIc genotype and VGIIa/major isolates from the United States are highly virulent compared to similar non-outbreak VGIIa/major-related isolates. Combined MLST-VNTR analysis distinguishes clonal expansion of the VGIIa/major outbreak genotype from related but distinguishable less-virulent genotypes isolated from other geographic regions. Our evidence documents emerging hypervirulent genotypes in the United States that may expand further and provides insight into the possible molecular and geographic origins of the outbreak.