7 results for "0.9 per mil" were added in DigitalCommons@The Texas Medical Center


Relevance: 100.00%

Abstract:

Cardiovascular disease has been the leading cause of death in the United States for over fifty years. While multiple risk factors for cardiovascular disease have been identified, hypertension is one of the most commonly recognized and treatable. Recent studies indicate that the prevalence of hypertension among children and adolescents is between 3% and 5%, much higher than originally estimated and likely rising due to the epidemic of obesity in the U.S. In 2004, the National High Blood Pressure Education Program Working Group on High Blood Pressure in Children and Adolescents published new guidelines for the diagnosis and treatment of hypertension in this population. Included in these recommendations was the creation of a new diagnosis, pre-hypertension, aimed at identifying children at risk for hypertension so that early lifestyle interventions could be provided in an effort to prevent its ultimate development. To determine the risk that pre-hypertension confers for the development of incident hypertension, a secondary analysis was performed of a repeated cross-sectional study measuring blood pressure in Houston-area adolescents from 2000 to 2007. Of 1006 students who participated in the blood pressure screening on more than one occasion and were not diagnosed with hypertension at the initial encounter, eleven were later found to have hypertension, yielding an overall incidence rate of 0.5% per year. Incidence rates were higher among overweight adolescents (1.9% per year [IRR 8.6 (1.97, 51.63)]); among students "at risk for hypertension" (pre-hypertensive, or initial blood pressure in the hypertensive range but falling on subsequent measures) (1.4% per year [IRR 4.77 (1.21, 19.78)]); and among those with blood pressure ≥90th percentile on three occasions (6.6% per year [IRR 21.87 (3.40, 112.40)]). Students with pre-hypertension as currently defined by the Task Force did have an increased rate of hypertension (1.1% per year), but the increase did not reach statistical significance [IRR 2.44 (0.42, 10.18)]. Further research is needed to determine the morbidity and mortality associated with pre-hypertension in this age group, as well as the effectiveness of various interventions for preventing the development of hypertensive disease among these at-risk individuals.
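
The core quantities in this abstract are person-time incidence rates and incidence rate ratios (IRRs) with confidence intervals. The sketch below shows one conventional way to compute them; it is not the study's code, and the case counts and person-years in the example are hypothetical placeholders.

```python
"""Minimal sketch (not the authors' code): incidence rates per person-year
and an incidence rate ratio (IRR) with a Wald-type 95% CI on the log scale.
All counts and person-years below are hypothetical placeholders."""
import math

def incidence_rate(cases: int, person_years: float) -> float:
    """Cases per person-year of follow-up."""
    return cases / person_years

def irr_with_ci(cases_exp, py_exp, cases_ref, py_ref, z=1.96):
    """IRR for an exposed group versus a reference group, with a log-scale Wald CI."""
    irr = incidence_rate(cases_exp, py_exp) / incidence_rate(cases_ref, py_ref)
    # Approximate SE of log(IRR) for two independent Poisson counts: sqrt(1/a + 1/b)
    se_log = math.sqrt(1 / cases_exp + 1 / cases_ref)
    lower = math.exp(math.log(irr) - z * se_log)
    upper = math.exp(math.log(irr) + z * se_log)
    return irr, (lower, upper)

# Hypothetical example: 6 incident cases over 320 person-years in one group
# versus 5 cases over 2,300 person-years in the reference group.
print(irr_with_ci(6, 320, 5, 2300))
```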

Relevance: 100.00%

Abstract:

Dental services performed for children receiving Medicaid are reimbursed per service, whereas dental treatment for military dependents provided at a military installation is neither directly reimbursable to those providing the care nor billed to those receiving the care. The purpose of this study was to compare pediatric dental services provided for a Medicaid population with those provided at a federally subsidized military facility, examining treatment choices and the subsequent costs of care. It was hypothesized that differences in dental procedures for Medicaid and military dependent children would exist based upon treatment philosophy and payment method. A total of 240 records were reviewed for this study, consisting of 120 Medicaid patients at the University of Texas Health Science Center at San Antonio (UTHSCSA) and 120 military dependents at Wilford Hall Medical Center (WHMC), Lackland Air Force Base, San Antonio. Demographic data and treatment information were abstracted for children receiving dental treatment under general anesthesia between 2002 and 2006. Data were analyzed using the Wilcoxon rank sum test, Kruskal-Wallis test, and Fisher's exact test. The Medicaid recipients treated at UTHSCSA were younger than patients at WHMC (40.2 vs. 49.8 months, p<.001). The university also treated significantly more Hispanic children than WHMC (78.3% vs. 30.0%, p<.001). Children at UTHSCSA had a mean of 9.5 decayed teeth and were treated with 2.3 composite fillings, 0 amalgam fillings, 5.6 stainless steel crowns, 1.1 pulp therapies, 1.6 extractions, and 1.0 sealant. Children at WHMC had a mean of 8.7 decayed teeth and were treated with 1.4 composite fillings, 0.9 amalgam fillings, 5.6 stainless steel crowns, 1.7 pulp therapies, 0.9 extractions, and 2.1 sealants. The means of decayed teeth, total fillings, and stainless steel crowns were not statistically different. UTHSCSA provided more composite fillings (p<.001), fewer amalgam fillings (p<.001), fewer pulp therapies (p<.001), more extractions (p=.01), and fewer sealants (p<.001) when compared to WHMC. Age and gender did not affect decay rates, but children of Hispanic ethnicity did experience more decay than non-Hispanics (9.5 vs. 8.6, p=.02). Based upon Texas Medicaid reimbursement rates from 2006, the cost for dental treatment at both sites was approximately $650 per child. The results of this study do not support the hypothesis that Medicaid providers deliver less conservative, and therefore more costly, care when compared to a military treatment center.
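
The comparisons above rely on three standard nonparametric tests. As an illustration only (not the authors' analysis code), the sketch below runs the same tests in SciPy; the per-child counts are hypothetical, and the 2x2 table only approximates the counts implied by the reported ethnicity percentages.

```python
"""Illustrative sketch of the test battery named in the abstract, using SciPy.
The arrays below are hypothetical placeholders, not study data."""
from scipy import stats

# Hypothetical counts of stainless steel crowns per child at the two sites
uthscsa_crowns = [5, 6, 4, 7, 5, 6]
whmc_crowns = [6, 5, 5, 7, 6, 4]

# Wilcoxon rank sum (Mann-Whitney U) test for two independent groups
print(stats.mannwhitneyu(uthscsa_crowns, whmc_crowns, alternative="two-sided"))

# Kruskal-Wallis test across more than two groups (e.g., age bands; hypothetical)
print(stats.kruskal([3, 4, 5], [4, 5, 6], [5, 6, 7]))

# Fisher's exact test on a 2x2 table: site vs. Hispanic ethnicity
# (approximate counts implied by 78.3% and 30.0% of 120 children per site)
print(stats.fisher_exact([[94, 26], [36, 84]]))
```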

Relevance: 100.00%

Abstract:

A study was conducted in 4 villages in Bilbeis, Egypt, to document infant feeding practices and identify their determinants, and to examine the associations between feeding practices and diarrhea incidence in infants. A cohort of 152 infants was followed from birth with twice-weekly home visits to record feeding practices and diarrheal illness. Cross-sectional information was obtained about child birth; early neonatal feeding practices; and the socioeconomic, demographic, and water and sanitation characteristics of study families.

Prelacteal feeds were given to 60% of the infants. Nineteen percent of the infants were wet nursed at least once during the first week of life. Breast-feeding prevalence declined from 100% among infants aged less than 12 weeks to 84% among those aged 44-47 weeks. The prevalence of exclusive breast-feeding among breast-fed infants was 38% in those aged less than 4 weeks, increased to 54% in age period 4-7 weeks, and then declined rapidly to 4% in age period 24-27 weeks. The patterns and determinants of consumption of specific supplements by breast-fed infants were examined in detail.

Between birth and age 47 weeks, the diarrhea incidence rate per person-year among breast-fed infants (6.84 episodes) was nearly identical to the rate among all infants (6.89 episodes). In age period 0-11 weeks, the diarrhea incidence rate among breast-fed infants receiving supplements was 1.3 times the rate among those exclusively breast-fed (95% confidence interval: 0.9-2.0). In other age periods, diarrhea incidence was generally nonsignificantly higher among exclusively breast-fed infants than among those partially breast-fed and those completely weaned.

Both univariate and multivariate analyses were done to examine the associations between diarrhea incidence and the consumption of specific supplements by breast-fed infants. After multivariate adjustment, the supplements that showed significant, borderline, or suggestive positive associations with diarrhea incidence were cereal-water, cheese, raw vegetables, and 'other' foods. Significant, borderline, or suggestive negative associations were observed between diarrhea incidence and the intake of fresh animal milk and potatoes.

To reduce the risk of diarrhea, indiscriminate use of supplements among Bilbeis infants aged less than 12 weeks should be strongly discouraged. While mothers in this area should be educated about methods of safer preparation, handling, storage, and administration of all weaning foods, their attention should be particularly drawn to the 4 foods found to be positively associated with diarrhea incidence among infants in this study.
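
Multivariate adjustment of supplement-specific associations on a person-time scale is commonly done with Poisson regression using a log person-time offset. The sketch below illustrates that general technique; it is an assumption about the setup rather than the study's actual model, and the data frame, column names, and values are hypothetical.

```python
"""Minimal sketch (assumed technique, hypothetical data): Poisson regression of
diarrhea episode counts on supplement indicators with a log person-time offset,
a standard way to obtain multivariately adjusted incidence rate ratios."""
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical per-infant records: episode count, weeks of follow-up, exposure flags
df = pd.DataFrame({
    "episodes":     [4, 7, 2, 9, 5, 3],
    "weeks":        [40, 36, 44, 30, 42, 38],
    "cereal_water": [0, 1, 0, 1, 1, 0],
    "animal_milk":  [1, 0, 1, 0, 0, 1],
})

X = sm.add_constant(df[["cereal_water", "animal_milk"]])
model = sm.GLM(df["episodes"], X,
               family=sm.families.Poisson(),
               offset=np.log(df["weeks"] / 52.0))  # offset = log(person-years)
result = model.fit()
# Exponentiated coefficients: baseline rate (constant) and adjusted IRRs per supplement
print(np.exp(result.params))
```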

Relevance: 100.00%

Abstract:

A 6-month-long, bench-scale simulation of an industrial wastewater stabilization pond (WSP) system was conducted to evaluate responses to several potential performance-enhancing treatments. The industrial WSP system consists of an anaerobic primary (1ry) WSP treating high-strength wastewater, followed by facultative secondary (2ry) and aerobic tertiary (3ry) WSPs in series treating lower-strength wastewater. The 1ry WSP was simulated with four glass aquaria fed with wastewater from the actual WSP system. The treatments examined were phosphorus supplementation (PHOS), phosphorus supplementation with pH control (PHOS+ALK), and phosphorus supplementation with pH control and effluent recycle (PHOS+ALK+RCY). The supplementary phosphorus treatment alone did not yield any significant change versus the CONTROL 1ry model pond. The average carbon to phosphorus ratio of the feed wastewater received from the WSP system was already 100:0.019 (i.e., 2,100 mg/l : 0.4 mg/l). The pH-control treatments (PHOS+ALK and PHOS+ALK+RCY) produced significant results, with 9 to 12 percent more total organic carbon (TOC) removal, 43 percent more volatile organic acid (VOA) generation, 78 percent more 2-ethoxyethanol removal and 14 percent more bis(2-chloroethyl)ether removal, and 100- to 10,000-fold increases in bacterial enzyme activity and heterotrophic bacterial numbers. Recycling a 10-percent portion of the effluent yielded less variability in certain physicochemical parameters in the PHOS+ALK+RCY 1ry model pond, but overall there was no statistically detectable improvement in performance versus no recycle. The 2ry and 3ry WSPs were also simulated in the laboratory to monitor the effect and fate of increased phosphorus loadings, as might occur if supplemental phosphorus were added to the 1ry WSP. Noticeable increases in algal growth were observed at feed phosphorus concentrations of 0.5 mg/l; however, there were no significant changes in the monitored physicochemical parameters. The effluent phosphorus concentrations from both the 2ry and 3ry model ponds did increase notably when feed phosphorus concentrations were increased from 0.5 to 1.0 mg/l.
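
For reference, the quoted carbon-to-phosphorus ratio is simply the measured concentrations normalized to a carbon basis of 100. A quick check, using only the values stated in the abstract:

```python
"""Arithmetic check of the C:P ratio quoted in the abstract:
2,100 mg/L TOC and 0.4 mg/L phosphorus, normalized to C = 100."""
toc_mg_per_l = 2100.0
p_mg_per_l = 0.4
print(f"C:P = 100:{100 * p_mg_per_l / toc_mg_per_l:.3f}")  # -> C:P = 100:0.019
```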

Relevance: 100.00%

Abstract:

The use of exercise electrocardiography (ECG) to detect latent coronary heart disease (CHD) is discouraged in apparently healthy populations because of low sensitivity. These recommendations, however, are based on the efficacy of evaluating ischemia (ST segment changes), with little regard for other measures of cardiac function that are available during exertion. The purpose of this investigation was to determine the association of maximal exercise hemodynamic responses with risk of mortality due to all causes, cardiovascular disease (CVD), and coronary heart disease (CHD) in apparently healthy individuals. Study participants were 20,387 men (mean age = 42.2 years) and 6,234 women (mean age = 41.9 years), all patients of a preventive medicine center in Dallas, TX, examined between 1971 and 1989. During an average of 8.1 years of follow-up, there were 348 deaths in men and 66 deaths in women. In men, age-adjusted all-cause death rates (per 10,000 person-years) across quartiles of maximal systolic blood pressure (SBP) (low to high) were: 18.2, 16.2, 23.8, and 24.6 (p for trend <0.001). Corresponding rates for maximal heart rate were: 28.9, 15.9, 18.4, and 15.1 (p for trend <0.001). After adjustment for confounding variables including age, resting systolic pressure, serum cholesterol and glucose, body mass index, smoking status, physical fitness, and family history of CVD, risks (and 95% confidence intervals (CI)) of all-cause mortality for quartiles of maximal SBP, relative to the lowest quartile, were: 0.96 (0.70-1.33), 1.36 (1.01-1.85), and 1.37 (0.98-1.92) for quartiles 2-4, respectively. The corresponding risks for maximal heart rate were: 0.61 (0.44-0.85), 0.69 (0.51-0.93), and 0.60 (0.41-0.87). No associations were noted between maximal exercise rate-pressure product and mortality. Similar results were seen for risk of CVD and CHD death. In women, similar trends in age-adjusted all-cause and CVD death rates across maximal SBP and heart rate categories were observed. Sensitivity of the exercise test in predicting mortality was enhanced when ECG results were evaluated together with maximal exercise SBP or heart rate, with a concomitant decrease in specificity; positive predictive values were not improved. The efficacy of the exercise test in predicting mortality in apparently healthy men and women was not enhanced by using maximal exercise hemodynamic responses. These results suggest that an exaggerated systolic blood pressure response or an attenuated heart rate response to maximal exercise is a risk factor for mortality in apparently healthy individuals.
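
The quartile-specific figures above are deaths per 10,000 person-years. As a purely illustrative sketch, the crude (unadjusted) version of that calculation on hypothetical person-level records looks like this:

```python
"""Illustrative sketch (hypothetical records, no age adjustment): death rates
per 10,000 person-years across quartiles of maximal exercise systolic BP."""
import pandas as pd

# Hypothetical person-level data: max SBP (mm Hg), follow-up years, vital status
df = pd.DataFrame({
    "max_sbp":  [168, 182, 196, 210, 175, 205, 188, 224],
    "followup": [8.2, 7.9, 8.5, 6.1, 9.0, 7.4, 8.8, 5.5],
    "died":     [0, 0, 1, 1, 0, 0, 0, 1],
})

# Quartiles of maximal SBP, lowest (Q1) to highest (Q4)
df["sbp_quartile"] = pd.qcut(df["max_sbp"], 4, labels=["Q1", "Q2", "Q3", "Q4"])

grouped = df.groupby("sbp_quartile", observed=True)
rates = 10_000 * grouped["died"].sum() / grouped["followup"].sum()
print(rates)  # crude deaths per 10,000 person-years by quartile
```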

Relevance: 100.00%

Abstract:

Childhood obesity is a significant public health problem. Over 15 percent of children in the United States are obese, and about 25 percent of children in Texas are overweight (CDC NHANES). Furthermore, about 30 percent of elementary school aged children in Harris County, Texas are overweight or obese (Children at Risk Institute 2010). In addition to actions such as increasing physical activity, decreasing television watching and video game time, and decreasing snacking on low-nutrient, calorie-dense foods and sugar-sweetened beverages, children need to consume more fruits and vegetables. According to the National Health and Nutrition Examination Survey (NHANES) from 2002, about 26 percent of U.S. children meet the recommendations for daily fruit intake and about 16 percent meet the recommendations for daily vegetable intake (CDC NHANES). In 2004, the average total intake by children ages four to nine years old in the U.S. was 0.9 cups of vegetables and 1.1 cups of fruit per day (CDC NHANES). Not only do children need effective nutrition education to learn about fruits and vegetables, they also need access and repeated exposure to fruits and vegetables (Anderson 2009, Briefel 2009). Nutrition education interventions that provide a structured, hands-on curriculum, such as school gardens, have produced significant changes in child fruit and vegetable intake (Blair 2009, McAleese 2007). To prevent childhood obesity from continuing into adolescence and adulthood, effective nutrition education interventions need to be implemented immediately and for the long term. However, research has shown short-term nutrition education interventions such as summer camps to be effective in producing significant changes in child fruit and vegetable intake, preferences, and knowledge (Heim 2009).

A four-week summer camp based on cooking and gardening was implemented at 6 Multi-Service centers in a large urban city. The participants included children ranging in age from 7 to 14 years old (n=64). The purpose of the camp was to introduce children to their food from the seed to the plate through gardening and culinary exercises. The summer camp activities were aimed at increasing the children's exposure, willingness to try, preferences, knowledge, and intake of fruits and vegetables. A survey measuring pre- and post-camp differences in knowledge, intake, willingness to try, and preferences for fruits and vegetables was given on the first day of camp and again on the last day. The present study examined the short-term effectiveness of this cooking and garden-based nutrition education program on knowledge, willingness, preferences, and intake among children aged 8 to 13 years old (n=40). The final sample of participants (n=40) was limited to those who completed both the pre- and post-test surveys and who were in or above the third grade level. Results showed a statistically significant increase in the reported intake of vegetables and in preferences for vegetables, specifically green beans, and for fruits. There was also a significant increase in preferences for fruits among boys and among participants ages 11 to 13 years. The results showed changes in the expected direction for willingness to try, preferences for vegetables, and intake of fruit; however, these changes were not statistically significant. Interestingly, the results also showed a decrease in the intake of low-nutrient, calorie-dense foods such as sweets and candy.
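
For pre/post survey scores collected from the same children, a paired nonparametric comparison is one plausible analysis; the abstract does not name the test the authors used, so the sketch below is illustrative only, with hypothetical scores.

```python
"""Illustrative sketch (hypothetical scores, not camp data): paired pre/post
comparison of reported vegetable intake using the Wilcoxon signed-rank test."""
from scipy import stats

pre_intake  = [1, 2, 1, 3, 2, 2, 1, 4, 2, 3]  # servings/day before camp (hypothetical)
post_intake = [2, 2, 3, 3, 4, 2, 2, 4, 3, 4]  # servings/day after camp (hypothetical)

# Paired test on the same children; zero differences are dropped by default
stat, p_value = stats.wilcoxon(pre_intake, post_intake)
print(f"Wilcoxon signed-rank statistic = {stat:.1f}, p = {p_value:.3f}")
```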

Relevance: 100.00%

Abstract:

OBJECTIVE. To determine the effectiveness of active surveillance cultures and associated infection control practices on the incidence of methicillin-resistant Staphylococcus aureus (MRSA) in the acute care setting.

DESIGN. A historical analysis of existing clinical data utilizing an interrupted time series design.

SETTING AND PARTICIPANTS. Patients admitted to a 260-bed tertiary care facility in Houston, TX between January 2005 and December 2010.

INTERVENTION. Infection control practices, including enhanced barrier precautions, compulsive hand hygiene, disinfection and environmental cleaning, and executive ownership and education, were simultaneously introduced during a 5-month intervention implementation period culminating in the implementation of active surveillance screening. Beginning June 2007, all high-risk patients were cultured for MRSA nasal carriage within 48 hours of admission. Segmented Poisson regression was used to test the significance of the difference in the incidence of healthcare-associated MRSA during the 29-month pre-intervention period compared to the 43-month post-intervention period.

RESULTS. A total of 9,957 of 11,095 high-risk patients (89.7%) were screened for MRSA carriage during the intervention period. Active surveillance cultures identified 1,330 MRSA-positive patients (13.4%), contributing to an admission prevalence of 17.5% in high-risk patients. The mean rate of healthcare-associated MRSA infection and colonization decreased from 1.1 per 1,000 patient-days in the pre-intervention period to 0.36 per 1,000 patient-days in the post-intervention period (P<0.001). Both the intervention and the percentage of S. aureus isolates susceptible to oxacillin were statistically significantly associated with the incidence of MRSA infection and colonization (IRR = 0.50, 95% CI = 0.31-0.80, and IRR = 0.004, 95% CI = 0.00003-0.40, respectively).

CONCLUSIONS. Aggressively targeting patients at high risk for MRSA colonization with active surveillance cultures and associated infection control practices, as part of a multifaceted, hospital-wide intervention, is effective in reducing the incidence of healthcare-associated MRSA.
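
The analysis named above, segmented Poisson regression for an interrupted time series, models monthly case counts with a log(patient-days) offset and terms for the baseline trend, a level change at the intervention, and a change in slope afterwards. The sketch below illustrates that general setup; it is not the study's code, and the monthly counts and census are simulated/hypothetical (only the 29/43-month split is taken from the abstract).

```python
"""Illustrative sketch (simulated data): segmented Poisson regression for an
interrupted time series with a log(patient-days) offset."""
import numpy as np
import pandas as pd
import statsmodels.api as sm

months = np.arange(1, 73)                      # 72 study months (29 pre + 43 post)
post = (months > 29).astype(int)               # 1 after the intervention begins
time_after = np.where(post == 1, months - 29, 0)

df = pd.DataFrame({
    # Simulated monthly MRSA counts: higher mean before, lower mean after
    "cases": np.random.default_rng(0).poisson(lam=np.where(post == 1, 3, 8)),
    "patient_days": np.full(72, 7000),         # hypothetical constant monthly census
    "month": months,
    "post": post,
    "time_after": time_after,
})

X = sm.add_constant(df[["month", "post", "time_after"]])
fit = sm.GLM(df["cases"], X, family=sm.families.Poisson(),
             offset=np.log(df["patient_days"])).fit()
# Exponentiated coefficients: baseline rate (constant), pre-intervention trend,
# level change at the intervention, and change in slope afterwards
print(np.exp(fit.params))
```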