938 results for United States. Drug Enforcement Administration.
Abstract:
Studies suggest that income replacement is low for many workers with serious occupational injuries and illnesses. This review discusses three areas that hold promise for raising benefits to workers while reducing workers' compensation costs to employers: improving safety, containing medical costs, and reducing litigation. In theory, workers' compensation increases the costs to employers of injuries and so provides incentives to improve safety. Yet, taken as a whole, research does not provide convincing evidence that workers' compensation reduces injury rates. Moreover, unlike safety and health regulation, workers' compensation focuses the attention of employers on individual workers. High costs may lead employers to discourage claims and litigate when claims are filed. Controlling medical costs can reduce workers' compensation costs. Most studies, however, have focused on costs and have not addressed the effectiveness of medical care or patient satisfaction. Research also has shown that workers' compensation systems can reduce the need for litigation. Without litigation, benefits can be delivered more quickly and at lower costs.
Abstract:
Knowing one's HIV status is particularly important in the setting of recent tuberculosis (TB) exposure. Blood tests for assessment of tuberculosis infection, such as the QuantiFERON Gold in-tube test (QFT; Cellestis Limited, Carnegie, Victoria, Australia), offer the possibility of simultaneous screening for TB and HIV with a single blood draw. We performed a cross-sectional analysis of all contacts to a highly infectious TB case in a large meatpacking factory. Twenty-two percent were foreign-born and 73% were black. Contacts were tested with both tuberculin skin testing (TST) and QFT. HIV testing was offered on an opt-out basis. Persons with TST ≥10 mm, positive QFT, and/or positive HIV test were offered latent TB treatment. Three hundred twenty-six contacts were screened: TST results were available for 266 people and an additional 24 reported a prior positive TST for a total of 290 persons with any TST result (89.0%). Adequate QFT specimens were obtained for 312 persons (95.7%). Thirty-two persons had QFT results but did not return for TST reading. Twenty-two percent met the criteria for latent TB infection. Eighty-eight percent accepted HIV testing. Two (0.7%) were HIV seropositive; both individuals were already aware of their HIV status, but one had stopped care a year previously. None of the HIV-seropositive persons had latent TB, but all were offered latent TB treatment per standard guidelines. This demonstrates that opt-out HIV testing combined with QFT in a large TB contact investigation was feasible and useful. HIV testing was also widely accepted. Pairing QFT with opt-out HIV testing should be strongly considered when possible.
Abstract:
The best wind sites in the United States are often located far from electricity demand centers and lack transmission access. Local sites that have lower quality wind resources but do not require as much power transmission capacity are an alternative to distant wind resources. In this paper, we explore the trade-offs between developing new wind generation at local sites and installing wind farms at remote sites. We first examine the general relationship between the high capital costs required for local wind development and the relatively lower capital costs required to install a wind farm capable of generating the same electrical output at a remote site, with the results representing the maximum amount an investor should be willing to pay for transmission access. We suggest that this analysis can be used as a first step in comparing potential wind resources to meet a state renewable portfolio standard (RPS). To illustrate, we compare the cost of local wind (∼50 km from the load) to the cost of distant wind requiring new transmission (∼550-750 km from the load) to meet the Illinois RPS. We find that local, lower capacity factor wind sites are the lowest cost option for meeting the Illinois RPS if new long distance transmission is required to access distant, higher capacity factor wind resources. If higher capacity wind sites can be connected to the existing grid at minimal cost, in many cases they will have lower costs.
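The break-even logic described in this abstract can be sketched in a few lines: a local site with a lower capacity factor needs more nameplate capacity (and hence more capital) to deliver the same annual energy as a remote site, and that capital-cost difference bounds what an investor should pay for transmission. This is an illustrative sketch, not the paper's model; the function names and all numbers are assumptions.

```python
# Illustrative sketch (not the paper's model): the ceiling on transmission
# spending equals the capital-cost savings from siting wind at a windier,
# remote location that delivers the same annual energy. Numbers are made up.

HOURS_PER_YEAR = 8760

def nameplate_mw(annual_energy_mwh, capacity_factor):
    """Nameplate capacity (MW) needed to produce a target annual energy."""
    return annual_energy_mwh / (capacity_factor * HOURS_PER_YEAR)

def max_transmission_budget(annual_energy_mwh, cf_local, cf_remote, capex_per_mw):
    """Capital-cost difference between local and remote farms of equal output."""
    local_capex = nameplate_mw(annual_energy_mwh, cf_local) * capex_per_mw
    remote_capex = nameplate_mw(annual_energy_mwh, cf_remote) * capex_per_mw
    return local_capex - remote_capex

# Illustrative case: 1 TWh/yr target, 28% local vs. 40% remote capacity
# factor, $1.5M per installed MW.
budget = max_transmission_budget(1_000_000, 0.28, 0.40, 1_500_000)
print(f"Max transmission budget: ${budget / 1e6:.0f}M")
```

If the actual cost of new long-distance transmission exceeds this budget, the local, lower capacity factor site is the cheaper way to meet the RPS target, which is the comparison the paper applies to Illinois.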
Abstract:
The variability of summer precipitation in the southeastern United States is examined in this study using 60-yr (1948-2007) rainfall data. The Southeast summer rainfalls exhibited higher interannual variability with more intense summer droughts and anomalous wetness in the recent 30 years (1978-2007) than in the prior 30 years (1948-77). Such intensification of summer rainfall variability was consistent with a decrease of light (0.1-1 mm day-1) and medium (1-10 mm day-1) rainfall events during extremely dry summers and an increase of heavy (>10 mm day-1) rainfall events in extremely wet summers. Changes in rainfall variability were also accompanied by a southward shift of the region of maximum zonal wind variability at the jet stream level in the latter period. The covariability between the Southeast summer precipitation and sea surface temperatures (SSTs) is also analyzed using the singular value decomposition (SVD) method. It is shown that the increase of Southeast summer precipitation variability is primarily associated with a higher SST variability across the equatorial Atlantic and also SST warming in the Atlantic. © 2010 American Meteorological Society.
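The SVD (maximum covariance analysis) method this abstract mentions can be outlined briefly: form the cross-covariance matrix between two anomaly fields (precipitation and SST), then decompose it to obtain coupled spatial patterns and the fraction of squared covariance each mode explains. The sketch below uses synthetic random fields, not the study's data.

```python
# Illustrative sketch of SVD-based covariability analysis on synthetic data.
import numpy as np

rng = np.random.default_rng(0)
n_years, n_precip_pts, n_sst_pts = 60, 50, 80

# Anomaly matrices (time x space) with the time mean removed.
precip = rng.standard_normal((n_years, n_precip_pts))
sst = rng.standard_normal((n_years, n_sst_pts))
precip -= precip.mean(axis=0)
sst -= sst.mean(axis=0)

# Cross-covariance matrix between the two fields.
cov = precip.T @ sst / (n_years - 1)

# SVD: columns of u and rows of vt are the coupled spatial patterns;
# the singular values measure the covariance captured by each mode.
u, s, vt = np.linalg.svd(cov, full_matrices=False)
scf = s**2 / np.sum(s**2)  # squared covariance fraction per mode
print(f"Leading mode explains {scf[0]:.1%} of squared covariance")

# Expansion coefficients (time series) for the leading coupled mode.
pc_precip = precip @ u[:, 0]
pc_sst = sst @ vt[0, :]
```

In the study's setting, the leading mode's SST pattern and expansion coefficients would be inspected to link equatorial Atlantic SST variability to Southeast precipitation.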
Variation in use of surveillance colonoscopy among colorectal cancer survivors in the United States.
Abstract:
BACKGROUND: Clinical practice guidelines recommend colonoscopies at regular intervals for colorectal cancer (CRC) survivors. Using data from a large, multi-regional, population-based cohort, we describe the rate of surveillance colonoscopy and its association with geographic, sociodemographic, clinical, and health services characteristics. METHODS: We studied CRC survivors enrolled in the Cancer Care Outcomes Research and Surveillance (CanCORS) study. Eligible survivors were diagnosed between 2003 and 2005, had curative surgery for CRC, and were alive without recurrence 14 months after surgery. Data came from patient interviews and medical record abstraction. We used a multivariate logit model to identify predictors of colonoscopy use. RESULTS: Despite guidelines recommending surveillance, only 49% of the 1423 eligible survivors received a colonoscopy within 14 months after surgery. We observed large differences across regions (38% to 57%). Survivors who received surveillance colonoscopy were more likely to: have colon cancer than rectal cancer (OR = 1.41, 95% CI: 1.05-1.90); have visited a primary care physician (OR = 1.44, 95% CI: 1.14-1.82); and have received adjuvant chemotherapy (OR = 1.75, 95% CI: 1.27-2.41). Compared to survivors with no comorbidities, survivors with moderate or severe comorbidities were less likely to receive surveillance colonoscopy (OR = 0.69, 95% CI: 0.49-0.98 and OR = 0.44, 95% CI: 0.29-0.66, respectively). CONCLUSIONS: Despite guidelines, more than half of CRC survivors did not receive surveillance colonoscopy within 14 months of surgery, with substantial variation by site of care. The association with primary care visits and adjuvant chemotherapy use suggests that access to care following surgery affects cancer surveillance.
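The multivariate logit analysis this abstract describes, with effects reported as odds ratios, can be sketched generically: fit a logistic regression of colonoscopy receipt on binary predictors and exponentiate the coefficients. The sketch below is an assumption-laden illustration on synthetic data (the predictor names are invented stand-ins, not CanCORS variables).

```python
# Illustrative sketch (synthetic data, not CanCORS): a logit model for
# colonoscopy receipt, with coefficients exponentiated into odds ratios.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)
n = 1423  # cohort size taken from the abstract

# Invented binary predictors: colon (vs. rectal) cancer, primary care visit,
# adjuvant chemotherapy, severe comorbidity.
X = rng.integers(0, 2, size=(n, 4)).astype(float)
true_beta = np.array([0.34, 0.36, 0.56, -0.82])  # illustrative log-odds
logits = -0.1 + X @ true_beta
y = rng.random(n) < 1 / (1 + np.exp(-logits))

# A very large C makes the fit approximately unpenalized maximum likelihood.
model = LogisticRegression(C=1e6).fit(X, y)
odds_ratios = np.exp(model.coef_[0])
for name, orat in zip(
    ["colon_cancer", "pcp_visit", "adjuvant_chemo", "severe_comorbidity"],
    odds_ratios,
):
    print(f"OR[{name}] = {orat:.2f}")
```

An OR above 1 indicates the factor is associated with higher odds of receiving surveillance colonoscopy; confidence intervals like those in the abstract would come from the coefficient standard errors, which a statistics-oriented package reports directly.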
Abstract:
Cryptococcus gattii causes life-threatening disease in otherwise healthy hosts and to a lesser extent in immunocompromised hosts. The highest incidence for this disease is on Vancouver Island, Canada, where an outbreak is expanding into neighboring regions including mainland British Columbia and the United States. This outbreak is caused predominantly by C. gattii molecular type VGII, specifically VGIIa/major. In addition, a novel genotype, VGIIc, has emerged in Oregon and is now a major source of illness in the region. Through molecular epidemiology and population analysis of MLST and VNTR markers, we show that the VGIIc group is clonal and hypothesize it arose recently. The VGIIa/IIc outbreak lineages are sexually fertile and studies support ongoing recombination in the global VGII population. This illustrates two hallmarks of emerging outbreaks: high clonality and the emergence of novel genotypes via recombination. In macrophage and murine infections, the novel VGIIc genotype and VGIIa/major isolates from the United States are highly virulent compared to similar non-outbreak VGIIa/major-related isolates. Combined MLST-VNTR analysis distinguishes clonal expansion of the VGIIa/major outbreak genotype from related but distinguishable less-virulent genotypes isolated from other geographic regions. Our evidence documents emerging hypervirulent genotypes in the United States that may expand further and provides insight into the possible molecular and geographic origins of the outbreak.
Abstract:
The health of clergy is important, and clergy may find health programming tailored to them more effective. Little is known about existing clergy health programs. We contacted Protestant denominational headquarters and searched academic databases and the Internet. We identified 56 clergy health programs and categorized them into prevention and personal enrichment; counseling; marriage and family enrichment; peer support; congregational health; congregational effectiveness; denominational enrichment; insurance/strategic pension plans; and referral-based programs. Only 13 of the programs engaged in outcomes evaluation. Using the Socioecological Framework, we found that many programs support individual-level and institutional-level changes, but few programs support congregational-level changes. Outcome evaluation strategies and a central repository for information on clergy health programs are needed. © 2011 Springer Science+Business Media, LLC.
Abstract:
This dissertation project identifies important works for solo saxophone by United States composers between 1975 and 2005. The quality, variety, expressiveness, and difficulty of the solo saxophone repertoire during these thirty years are remarkable and remedy, to some extent, the fact that the saxophone had been a largely neglected instrument in the realm of classical music. In twentieth-century music, including jazz, the saxophone nevertheless developed a unique and significant voice, as is evident in a repertoire that expanded immensely across many instrumental settings, including the orchestra, solo works, and a wide variety of chamber ensembles. Historically, the saxophone in the United States first found its niche in vaudeville, military bands, and jazz ensembles, while in Europe composers such as Debussy, D'Indy, Schmitt, Ibert, Glazounov, Heiden, and Desenclos recognized the potential of the instrument and wrote for it. The saxophone is well suited to the intimacy and unique timbral explorations of the solo literature, but only by the mid-twentieth century did the repertoire allow the instrument to flourish into a virtuosic and expressive voice presented by successive generations of performers – Marcel Mule, Sigurd Rascher, Cecil Leeson, Jean-Marie Londeix, Fred Hemke, Eugene Rousseau, and Donald Sinta. The very high artistic level of these soloists was inspiring, and dozens of new compositions were commissioned. Through the 1960s American composers such as Paul Creston, Leslie Bassett, Henry Cowell, Alec Wilder, and others produced eminent works for the saxophone, to be followed by an enormous output of quality compositions between 1975 and 2005. The works chosen for performance were selected from thousands of compositions between 1975 and 2005 researched for this project. The three recital dates were: April 6, 2005, in Gildenhorn Recital Hall; December 4, 2005, in Ulrich Recital Hall; and April 15, 2006, in Gildenhorn Recital Hall.
Recordings of these recitals may be obtained in person or online from the Michelle Smith Performing Arts Library of the University of Maryland, College Park.
Abstract:
The Archaeological Reconnaissance Survey of United States Naval Academy will provide the Navy with a rich understanding of the history of this property. A National Register of Historic Places District, such as the Academy, deserves a thorough analysis of its past, in order to preserve what exists and to plan for the future. The goal of this project is to investigate the history of the Academy through traditional historic research, innovative computer analysis of historic maps, oral history interviews, and tract histories. This information has been synthesized to provide the Navy with a planning tool for Public Works, a concise look at the cartographic history of the Academy, and a reference manual for the vast amount of information gathered during the course of this project. This information can serve as a reference tool to help the Public Works department comply with Section 106 regulations of the National Historic Preservation Act, with regard to construction. It can also serve as a source of cartographic history for those interested in the Academy's physical development, and as a way of preserving the culture of residents in Annapolis. This program and archaeological survey will ultimately serve to add to the rich history of the United States Naval Academy while preserving an important part of our nation's heritage.
Abstract:
PURPOSE/BACKGROUND: Dynamic balance is an important component of motor skill development. Poor dynamic balance has previously been associated with sport related injury. However, the vast majority of dynamic balance studies as they relate to sport injury have occurred in developed North American or European countries. Thus, the purpose of this study was to compare dynamic balance in adolescent male soccer players from Rwanda to a matched group from the United States. METHODS: Twenty-six adolescent male soccer players from Rwanda and 26 age- and gender-matched control subjects from the United States were screened using the Lower Quarter Y Balance Test during their pre-participation physical. Reach asymmetry (cm) between limbs was examined for all reach directions. In addition, reach distance in each direction (normalized to limb length, %LL) and the composite reach score (also normalized to %LL) were examined. Dependent samples t-tests were performed with significant differences identified at p<0.05. RESULTS: Twenty-six male soccer players from Rwanda (R) were matched to twenty-six male soccer players from the United States (US). The Rwandan soccer players performed better in the anterior (R: 83.9 ± 3.2 %LL; US: 76.5 ± 6.6 %LL, p<0.01), posterolateral (R: 114.4 ± 8.3 %LL; US: 106.5 ± 8.2 %LL, p<0.01) and composite (R: 105.6 ± 1.3 %LL; US: 97.8 ± 6.2 %LL, p<0.01) reach scores. No significant differences between groups were observed for reach asymmetry. CONCLUSIONS: Adolescent soccer players from Rwanda exhibit superior performance on a standardized dynamic balance test in comparison with similar athletes from the United States. The examination of movement abilities of athletes from countries of various origins may allow for a greater understanding of the range of true normative values for dynamic balance. LEVEL OF EVIDENCE: 3b.
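The dependent-samples t-test this abstract applies to matched pairs can be sketched as follows. The data below are simulated around the reported composite means and standard deviations (105.6 ± 1.3 vs. 97.8 ± 6.2 %LL, n = 26 pairs); this is an illustration of the test, not the study's raw data.

```python
# Illustrative sketch: paired t-test on simulated composite reach scores
# (%LL) for 26 matched pairs, mirroring the analysis the abstract reports.
import numpy as np
from scipy import stats

rng = np.random.default_rng(7)
n_pairs = 26

# Simulated scores drawn around the abstract's reported means and SDs.
rwanda = rng.normal(105.6, 1.3, n_pairs)
usa = rng.normal(97.8, 6.2, n_pairs)

# Dependent (paired) samples t-test, as used for the matched design.
t_stat, p_value = stats.ttest_rel(rwanda, usa)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```

Because the subjects are age- and gender-matched pairs, the paired test on within-pair differences is the appropriate choice over an independent two-sample test.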
Abstract:
Given the increases in spatial resolution and other improvements in climate modeling capabilities over the last decade since the CMIP3 simulations were completed, CMIP5 provides a unique opportunity to assess scientific understanding of climate variability and change over a range of historical and future conditions. With participation from over 20 modeling groups and more than 40 global models, CMIP5 represents the latest and most ambitious coordinated international climate model intercomparison exercise to date. Observations dating back to 1900 show that the temperatures in the twenty-first century have the largest spatial extent of record-breaking and much-above-normal mean monthly maximum and minimum temperatures. The 20-yr return value of the annual maximum or minimum daily temperature is one measure of changes in rare temperature extremes.
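The 20-yr return value mentioned in this abstract is commonly estimated by fitting a generalized extreme value (GEV) distribution to the series of annual maxima and reading off the quantile exceeded on average once every 20 years. The sketch below illustrates that standard procedure on synthetic data; it is not the study's computation.

```python
# Illustrative sketch: estimate a 20-yr return value by fitting a GEV
# distribution to synthetic annual maximum daily temperatures.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Simulated annual maximum daily temperatures (deg C) for 60 years;
# the Gumbel distribution is the shape = 0 special case of the GEV.
annual_max = rng.gumbel(loc=38.0, scale=1.5, size=60)

# Fit the three GEV parameters by maximum likelihood.
shape, loc, scale = stats.genextreme.fit(annual_max)

# The T-year return value is the (1 - 1/T) quantile of the fitted GEV.
return_value_20yr = stats.genextreme.ppf(1 - 1 / 20, shape, loc=loc, scale=scale)
print(f"20-yr return value: {return_value_20yr:.1f} deg C")
```

In practice, changes in this return value between climate periods (or between models and observations) are what diagnose shifts in rare temperature extremes.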
Abstract:
This study investigates the changes of the North Atlantic subtropical high (NASH) and its impact on summer precipitation over the southeastern (SE) United States using the 850-hPa geopotential height field in the National Centers for Environmental Prediction (NCEP) reanalysis, the 40-yr European Centre for Medium-Range Weather Forecasts (ECMWF) Re-Analysis (ERA-40), long-term rainfall data, and Intergovernmental Panel on Climate Change (IPCC) Fourth Assessment Report (AR4) model simulations during the past six decades (1948-2007). The results show that the NASH in the last 30 yr has become more intense, and its western ridge has displaced westward with an enhanced meridional movement compared to the previous 30 yr. When the NASH moved closer to the continental United States in the three most recent decades, the effect of the NASH on the interannual variation of SE U.S. precipitation is enhanced through the ridge's north-south movement. The study's attribution analysis suggested that the changes of the NASH are mainly due to anthropogenic warming. In the twenty-first century with an increase of the atmospheric CO2 concentration, the center of the NASH would be intensified and the western ridge of the NASH would shift farther westward. These changes would increase the likelihood of both strong anomalously wet and dry summers over the SE United States in the future, as suggested by the IPCC AR4 models. © 2011 American Meteorological Society.
Abstract:
© 2014, Springer-Verlag Berlin Heidelberg. This study assesses the skill of advanced regional climate models (RCMs) in simulating southeastern United States (SE US) summer precipitation and explores the physical mechanisms responsible for the simulation skill at a process level. Analysis of the RCM output for the North American Regional Climate Change Assessment Program indicates that the RCM simulations of summer precipitation show the largest biases and a remarkable spread over the SE US compared to other regions in the contiguous US. The causes of such a spread are investigated by performing simulations using the Weather Research and Forecasting (WRF) model, a next-generation RCM developed by the US National Center for Atmospheric Research. The results show that the simulated biases in SE US summer precipitation are due mainly to the misrepresentation of the modeled North Atlantic subtropical high (NASH) western ridge. In the WRF simulations, the NASH western ridge shifts 7° northwestward when compared to that in the reanalysis ensemble, leading to a dry bias in the simulated summer precipitation according to the relationship between the NASH western ridge and summer precipitation over the southeast. Experiments utilizing the four dimensional data assimilation technique further suggest that the improved representation of the circulation patterns (i.e., wind fields) associated with the NASH western ridge substantially reduces the bias in the simulated SE US summer precipitation. Our analysis of circulation dynamics indicates that the NASH western ridge in the WRF simulations is significantly influenced by the simulated planetary boundary layer (PBL) processes over the Gulf of Mexico. Specifically, a decrease (increase) in the simulated PBL height tends to stabilize (destabilize) the lower troposphere over the Gulf of Mexico, and thus inhibits (favors) the onset and/or development of convection.
Such changes in tropical convection induce a tropical–extratropical teleconnection pattern, which modulates the circulation along the NASH western ridge in the WRF simulations and contributes to the modeled precipitation biases over the SE US. In conclusion, our study demonstrates that the NASH western ridge is an important factor responsible for the RCM skill in simulating SE US summer precipitation. Furthermore, the improvements in the PBL parameterizations for the Gulf of Mexico might help advance RCM skill in representing the NASH western ridge circulation and summer precipitation over the SE US.
Abstract:
Approximately 45,000 individuals are hospitalized annually for burn treatment. Rehabilitation after hospitalization can offer a significant improvement in functional outcomes. Little is known nationally about rehabilitation after burn injury, and observed regional differences in Medicare post-hospitalization spending suggest that practices may vary substantially by region. This study was designed to measure variation in rehabilitation utilization by state of hospitalization for patients hospitalized with burn injury. This retrospective cohort study used nationally collected data over a 10-year period (2001 to 2010), from the Healthcare Cost and Utilization Project (HCUP) State Inpatient Databases (SIDs). Patients hospitalized for burn injury (n = 57,968) were identified by ICD-9-CM codes; the primary endpoint was whether they were discharged immediately to inpatient rehabilitation after hospitalization. Both unadjusted and adjusted likelihoods were calculated for each state taking into account the effects of age, insurance status, hospitalization at a burn center, and extent of burn injury by TBSA. The relative risk of discharge to inpatient rehabilitation varied by as much as 6-fold among different states. Higher TBSA, having health insurance, higher age, and burn center hospitalization all increased the likelihood of discharge to inpatient rehabilitation following acute care hospitalization. There was significant variation between states in inpatient rehabilitation utilization after adjusting for variables known to affect each outcome. Future efforts should be focused on identifying the cause of this state-to-state variation, its relationship to patient outcome, and standardizing treatment across the United States.
Abstract:
OBJECTIVE: To ascertain the degree of variation, by state of hospitalization, in outcomes associated with traumatic brain injury (TBI) in a pediatric population. DESIGN: A retrospective cohort study of pediatric patients admitted to a hospital with a TBI. SETTING: Hospitals from states in the United States that voluntarily participate in the Agency for Healthcare Research and Quality's Healthcare Cost and Utilization Project. PARTICIPANTS: Pediatric (age ≤ 19 y) patients hospitalized for TBI (N=71,476) in the United States during 2001, 2004, 2007, and 2010. INTERVENTIONS: None. MAIN OUTCOME MEASURES: Primary outcome was proportion of patients discharged to rehabilitation after an acute care hospitalization among alive discharges. The secondary outcome was inpatient mortality. RESULTS: The relative risk of discharge to inpatient rehabilitation varied by as much as 3-fold among the states, and the relative risk of inpatient mortality varied by as much as nearly 2-fold. In the United States, approximately 1981 patients could be discharged to inpatient rehabilitation care if the observed variation in outcomes was eliminated. CONCLUSIONS: There was significant variation between states in both rehabilitation discharge and inpatient mortality after adjusting for variables known to affect each outcome. Future efforts should be focused on identifying the cause of this state-to-state variation, its relationship to patient outcome, and standardizing treatment across the United States.
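The "relative risk varied by as much as 3-fold" comparison in this abstract can be illustrated with a minimal unadjusted calculation: compute each state's rehabilitation-discharge rate and express it relative to the lowest-rate state. The state labels and counts below are invented for illustration; the study's adjusted estimates additionally control for covariates.

```python
# Illustrative sketch (invented counts): unadjusted relative risk of
# discharge to inpatient rehabilitation by state, relative to the
# lowest-rate state.
rehab_counts = {"A": (120, 2000), "B": (300, 2500), "C": (90, 3000)}

# Per-state discharge rate = discharged to rehab / total alive discharges.
rates = {s: discharged / total for s, (discharged, total) in rehab_counts.items()}
baseline = min(rates.values())
relative_risk = {s: rate / baseline for s, rate in rates.items()}

for state, rr in sorted(relative_risk.items(), key=lambda kv: kv[1]):
    print(f"State {state}: rate={rates[state]:.3f}, RR={rr:.2f}")
```

With these invented counts the highest-rate state has four times the discharge rate of the lowest, the same style of cross-state spread (3-fold for rehabilitation, nearly 2-fold for mortality) that the abstract reports after covariate adjustment.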