942 results for active data-centric
Abstract:
The built environment is recognized as having an impact on health and physical activity. Ecological theories of physical activity suggest that enhancing access to places to be physically active may increase activity levels. Studies show that users of fitness facilities are more likely to be active than inactive, and active people are more likely to report access to fitness facilities. The purpose of this study was to examine the ecologic relationship between the density of fitness facilities and self-reported levels of physical activity in adults in selected Metropolitan Statistical Areas (MSAs) in the United States.

The 2007 MSA Business Patterns and the 2007 Behavioral Risk Factor Surveillance System (BRFSS) were used to gather fitness facility and physical activity data for 141 MSAs in the United States. Pearson correlations were computed between fitness facility density (number of facilities per 100,000 people) and six summary measures of physical activity prevalence. Regional analysis was done using the nine U.S. Standard Regions for Temperature and Precipitation.

Direct correlations were found between fitness facility density and the percentage of people physically active (r=0.27, 95% CI 0.11, 0.42, p=0.0012), meeting moderate-intensity activity guidelines (r=0.23, 95% CI 0.07, 0.38, p=0.006), and meeting vigorous-intensity activity guidelines (r=0.30, 95% CI 0.14, 0.44, p=0.003). An inverse correlation was found between fitness facility density and the percentage of people physically inactive (r=-0.45, 95% CI -0.57, -0.31, p<0.0001). Regional analysis showed the same trends across most regions.

Access to fitness facilities, defined here as fitness facility density, is related to physical activity levels. The results suggest the potential importance of the built environment's influence on physical activity behaviors. Public health officials and city planners should consider the possible positive effect that increasing the number of fitness facilities in communities would have on activity levels.
Abstract:
Identifying accurate numbers of soldiers determined to be medically not ready after completing soldier readiness processing (SRP) may help inform Army leadership about ongoing pressures on a military engaged in long conflict with regular deployments. In Army soldiers screened using the SRP checklist for deployment, what is the prevalence of soldiers determined to be medically not ready?

Study group. 15,289 soldiers screened at all 25 Army deployment platform sites with the eSRP checklist over a 4-month period (June 20, 2009 to October 20, 2009). The data analyzed included age, rank, component, gender, and final deployment medical readiness status from the MEDPROS database.

Methods. This information was compiled, and univariate analysis using chi-square was conducted for each of the key variables by medical readiness status.

Results. Descriptive epidemiology: of the total sample, 1,548 (9.7%) were female and 14,319 (90.2%) were male. Enlisted soldiers made up 13,543 (88.6%) of the sample and officers 1,746 (11.4%). In the sample, 1,533 (10.0%) were soldiers over the age of 40 and 13,756 (90.0%) were age 18-40. Reserve, National Guard, and Active Duty made up 1,931 (12.6%), 2,942 (19.2%), and 10,416 (68.1%), respectively. Univariate analysis: overall, 1,226 (8.0%) of the soldiers screened were determined to be medically not ready for deployment. The strongest predictive factor was female gender (OR 2.8; 95% CI 2.57-3.28; p<0.001), followed by enlisted rank (OR 2.01; 95% CI 1.60-2.53; p<0.001), Reserve component (OR 1.33; 95% CI 1.16-1.53; p<0.001), and National Guard component (OR 0.37; 95% CI 0.30-0.46; p<0.001). Age over 40 demonstrated an OR of 1.2 (95% CI 1.09-1.50; p<0.003).

Overall, the results underscore that there may be key demographic groups related to medical readiness that can be targeted with programs and funding to improve overall military medical readiness.
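The odds ratios reported above come from 2x2 cross-tabulations of each demographic variable against readiness status. A minimal sketch of that computation, with a Wald 95% confidence interval on the log scale; the counts are purely hypothetical, since the abstract does not report the underlying cross-tabulation:

```python
import math

# Hypothetical 2x2 table (illustrative counts, NOT from the study):
#                 not ready   ready
# exposed group        30       70
# reference group      50      350
a, b = 30, 70
c, d = 50, 350

odds_ratio = (a * d) / (b * c)

# Wald 95% CI: exponentiate log(OR) +/- 1.96 * SE(log OR).
se = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se)
hi = math.exp(math.log(odds_ratio) + 1.96 * se)
print(f"OR = {odds_ratio:.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")
```

An OR above 1 with a CI excluding 1 (as for female gender or enlisted rank in the study) indicates higher odds of being medically not ready relative to the reference group.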
Abstract:
A life table methodology was developed which estimates the expected remaining Army service time and the expected remaining Army sick time by years of service for the United States Army population. A measure of illness impact was defined as the ratio of expected remaining Army sick time to expected remaining Army service time. The variances of the resulting estimators were developed on the basis of current data. The theory of partial and complete competing risks was considered for each type of decrement (death, administrative separation, and medical separation) and for the causes of sick time.

The methodology was applied to world-wide U.S. Army data for calendar year 1978. A total of 669,493 enlisted personnel and 97,704 officers were reported on active duty as of 30 September 1978. During calendar year 1978, the Army Medical Department reported 114,647 inpatient discharges and 1,767,146 sick days. Although the methodology is completely general with respect to the definition of sick time, only sick time associated with an inpatient episode was considered in this study.

Since the temporal measure was years of Army service, an age-adjusting process was applied to the life tables for comparative purposes. Analyses were conducted by rank (enlisted and officer), race, and sex, and were based on the ratio of expected remaining Army sick time to expected remaining Army service time. Seventeen major diagnostic groups, classified by the Eighth Revision, International Classification of Diseases, Adapted for Use in the United States, were ranked according to their cumulative (across years of service) contribution to expected remaining sick time.

The study results indicated that enlisted personnel tend to have more expected hospital-associated sick time relative to their expected Army service time than officers. Non-white officers generally have more expected sick time relative to their expected Army service time than white officers. This racial differential was not supported within the enlisted population. Females tend to have more expected sick time relative to their expected Army service time than males. This tendency remained after diagnostic groups 580-629 (Genitourinary System) and 630-678 (Pregnancy and Childbirth) were removed. Problems associated with the circulatory system, digestive system, and musculoskeletal system were the three leading causes of cumulative sick time across years of service.
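The illness-impact ratio defined in this abstract can be illustrated with a toy life-table calculation. All numbers below are hypothetical (not the 1978 Army data), and the sketch simplifies by assuming each soldier who continues serves a full year; the actual methodology also handles competing risks and variance estimation, which are omitted here:

```python
# Toy life-table sketch of expected remaining service time vs. sick time.
# p[k]    = probability a soldier with k completed years serves through year k+1
# sick[k] = mean sick days accrued during service year k+1
# (All values are invented for illustration.)
p    = [0.70, 0.75, 0.80, 0.85, 0.0]   # final cohort always separates
sick = [4.0, 3.5, 3.0, 2.5, 2.0]

def expectations(start_year):
    """Expected remaining service years and sick days from a given year of service."""
    surviving = 1.0
    e_service = 0.0
    e_sick = 0.0
    for k in range(start_year, len(p)):
        e_service += surviving            # serves year k+1 (full-year simplification)
        e_sick += surviving * sick[k]     # accrues that year's mean sick days
        surviving *= p[k]                 # fraction continuing to the next year
    return e_service, e_sick

e_serv, e_sick = expectations(0)
illness_impact = e_sick / e_serv   # the abstract's ratio of sick time to service time
print(f"E[service] = {e_serv:.3f} yr, E[sick] = {e_sick:.3f} d, ratio = {illness_impact:.3f}")
```

Comparing this ratio across subgroups (enlisted vs. officer, by race and sex) is what the study's age-adjusted analyses did.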
Abstract:
OBJECTIVE. To determine the effectiveness of active surveillance cultures and associated infection control practices on the incidence of methicillin-resistant Staphylococcus aureus (MRSA) in the acute care setting.

DESIGN. A historical analysis of existing clinical data using an interrupted time series design.

SETTING AND PARTICIPANTS. Patients admitted to a 260-bed tertiary care facility in Houston, TX from January 2005 through December 2010.

INTERVENTION. Infection control practices, including enhanced barrier precautions, compulsive hand hygiene, disinfection and environmental cleaning, and executive ownership and education, were simultaneously introduced during a 5-month intervention implementation period culminating in the implementation of active surveillance screening. Beginning June 2007, all high-risk patients were cultured for MRSA nasal carriage within 48 hours of admission. Segmented Poisson regression was used to test the significance of the difference in incidence of healthcare-associated MRSA during the 29-month pre-intervention period compared to the 43-month post-intervention period.

RESULTS. A total of 9,957 of 11,095 high-risk patients (89.7%) were screened for MRSA carriage during the intervention period. Active surveillance cultures identified 1,330 MRSA-positive patients (13.4%), corresponding to an admission prevalence of 17.5% in high-risk patients. The mean rate of healthcare-associated MRSA infection and colonization decreased from 1.1 per 1,000 patient-days in the pre-intervention period to 0.36 per 1,000 patient-days in the post-intervention period (P<0.001). Both the intervention and the percentage of S. aureus isolates susceptible to oxacillin were statistically significantly associated with the incidence of MRSA infection and colonization (IRR = 0.50, 95% CI = 0.31-0.80 and IRR = 0.004, 95% CI = 0.00003-0.40, respectively).

CONCLUSIONS. Aggressively targeting patients at high risk for MRSA colonization with active surveillance cultures and associated infection control practices, as part of a multifaceted, hospital-wide intervention, is effective in reducing the incidence of healthcare-associated MRSA.
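The incidence rates and rate ratio above can be illustrated with a simplified two-period comparison. The event counts and patient-day denominators below are invented (the abstract reports only rates), and the study itself used segmented Poisson regression, which additionally adjusts for secular trend rather than comparing two pooled rates:

```python
import math

# Hypothetical pre/post counts chosen only to illustrate the arithmetic;
# the abstract does not report raw counts.
events_pre, days_pre = 110, 100_000     # ~1.1 per 1,000 patient-days
events_post, days_post = 36, 100_000    # ~0.36 per 1,000 patient-days

rate_pre = 1000 * events_pre / days_pre
rate_post = 1000 * events_post / days_post

# Incidence rate ratio (IRR) with a Wald 95% CI on the log scale.
irr = (events_post / days_post) / (events_pre / days_pre)
se = math.sqrt(1 / events_pre + 1 / events_post)
lo = math.exp(math.log(irr) - 1.96 * se)
hi = math.exp(math.log(irr) + 1.96 * se)
print(f"IRR = {irr:.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")
```

An IRR below 1 with a CI excluding 1, as reported for the intervention (IRR = 0.50), indicates a reduced post-intervention incidence.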
Photosynthetically active radiation (PAR) measurements, SOIREE cruise track 1999-02-08 to 1999-02-28
Abstract:
Active fluorescence (fast repetition rate fluorometry, FRRF) was used to follow the photosynthetic response of the phytoplankton community during the 13-day Southern Ocean Iron RElease Experiment (SOIREE). This in situ iron enrichment was conducted in the polar waters of the Australasian-Pacific sector of the Southern Ocean in February 1999. Iron fertilisation of these high nitrate low chlorophyll (HNLC) waters resulted in an increase in the photosynthetic competence (Fv/Fm) of the resident cells from around 0.20 to greater than 0.60 (i.e. close to the theoretical maximum) by 10/11 days after the first enrichment. Although a significant iron-mediated response in Fv/Fm was detected as early as 24 h after the initial fertilisation, the increase in Fv/Fm to double ambient levels took 6 days. This response was five-fold slower than observed in iron enrichments (in situ and in vitro) in the HNLC waters of the subarctic and equatorial Pacific. Although little is known about the relationship between water temperature and Fv/Fm, it is likely that low water temperatures - and possibly the deep mixed layer - were responsible for this slow response time. During SOIREE, the photosynthetic competence of the resident phytoplankton in iron-enriched waters increased at dissolved iron levels above 0.2 nM, suggesting that iron limitation was alleviated at this concentration. Increases in Fv/Fm of cells within four algal size classes suggested that all taxa displayed a photosynthetic response to iron enrichment. Other physiological proxies of algal iron stress (such as flavodoxin levels in diatoms) exhibited temporal trends in response to iron enrichment that differed from those of Fv/Fm during the time-course of SOIREE. The relationship between Fv/Fm, algal growth rate and such proxies in Southern Ocean waters is discussed.
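The photosynthetic competence metric used throughout the abstract is the standard fluorescence-yield ratio Fv/Fm = (Fm - F0)/Fm, where F0 is the minimal and Fm the maximal fluorescence yield. A minimal sketch, with illustrative yields (not SOIREE measurements) chosen to reproduce the ambient (~0.20) and post-enrichment (~0.60) values quoted above:

```python
def fv_fm(f0, fm):
    """Photosynthetic competence: variable fluorescence Fv = Fm - F0, over Fm."""
    return (fm - f0) / fm

# Illustrative fluorescence yields in arbitrary units (NOT SOIREE data).
ambient = fv_fm(f0=0.80, fm=1.00)    # iron-stressed HNLC community, ~0.20
enriched = fv_fm(f0=0.40, fm=1.00)   # after iron fertilisation, ~0.60
print(f"ambient Fv/Fm = {ambient:.2f}, enriched Fv/Fm = {enriched:.2f}")
```

Iron stress raises F0 relative to Fm (damaged or down-regulated photosystem II reaction centres), which is why relief from iron limitation drives Fv/Fm toward its theoretical maximum of ~0.65.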
Abstract:
The early last glacial termination was characterized by intense North Atlantic cooling and weak overturning circulation. This interval between ~18,000 and 14,600 years ago, known as Heinrich Stadial 1, was accompanied by a disruption of global climate and has been suggested as a key factor for the termination. However, the response of interannual climate variability in the tropical Pacific (El Niño-Southern Oscillation) to Heinrich Stadial 1 is poorly understood. Here we use Sr/Ca in a fossil Tahiti coral to reconstruct tropical South Pacific sea surface temperature around 15,000 years ago at monthly resolution. Unlike today, interannual South Pacific sea surface temperature variability at typical El Niño-Southern Oscillation periods was pronounced at Tahiti. Our results indicate that the El Niño-Southern Oscillation was active during Heinrich Stadial 1, consistent with climate model simulations of enhanced El Niño-Southern Oscillation variability at that time. Furthermore, a greater El Niño-Southern Oscillation influence in the South Pacific during Heinrich Stadial 1 is suggested, resulting from a southward expansion or shift of El Niño-Southern Oscillation sea surface temperature anomalies.
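Coral Sr/Ca palaeothermometry, the method behind this reconstruction, rests on the fact that skeletal Sr/Ca varies inversely and roughly linearly with sea surface temperature, so a calibration Sr/Ca = a + b * SST (with b < 0) can be inverted to recover SST. The constants below are typical of published Porites calibrations but are illustrative assumptions, not the ones used in this study:

```python
# Hypothetical linear Sr/Ca-SST calibration (illustrative constants only).
a = 10.5    # intercept, mmol/mol
b = -0.06   # slope, mmol/mol per degC (negative: warmer water -> lower Sr/Ca)

def sst_from_srca(srca_mmol_mol):
    """Invert the calibration Sr/Ca = a + b * SST to reconstruct SST in degC."""
    return (srca_mmol_mol - a) / b

sst = sst_from_srca(8.94)   # a hypothetical measured ratio
print(f"reconstructed SST = {sst:.1f} degC")
```

Applied to monthly-resolved samples along a coral's growth axis, this yields the monthly SST series in which interannual (El Niño-Southern Oscillation band) variability can be assessed.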