918 results for Risk reduction
Abstract:
Much of the literature on disparities in access to health care among children has focused on measuring absolute and relative differences experienced by racial/ethnic groups and, to a lesser extent, socioeconomic groups. However, it is not clear from the existing literature how disparities in access to care may have changed over time for children, especially following implementation of the State Children's Health Insurance Program (SCHIP). The primary objective of this research was to determine whether there has been a decrease in disparities in access to care for children across two socioeconomic groups and across racial/ethnic groups after SCHIP implementation. Methods commonly used to measure 'health inequalities', including the population-attributable risk (PAR) and the relative index of inequality (RII), were used to measure disparities in access to care. Using these measures, there is evidence of a substantial decrease in socioeconomic disparities in health insurance coverage and, to a lesser extent, in having a usual source of care since the SCHIP program began. There is also evidence of a considerable decrease in disparities in access to care for non-Hispanic Black children. However, there appears to be a slight increase in disparities in access to care among Hispanic children compared to non-Hispanic White children. While there were great improvements in disparities in access to care with the introduction of the SCHIP program, continued progress may depend on continuation of the SCHIP program or similar targeted health policy programs.
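As a quick illustration of the two inequality measures named above, here is a minimal sketch assuming hypothetical group-level rates of lacking access to care (none of these numbers come from the study). The RII is computed here via a simplified weighted regression of group rates on ridit scores, one common formulation:

```python
# Minimal sketch (assumed data, not the study's dataset): population-
# attributable risk (PAR) and a regression-based relative index of
# inequality (RII) for a binary "lacks access to care" outcome.
import numpy as np

# Hypothetical group-level data, ordered from lowest to highest SES.
pop_share = np.array([0.20, 0.30, 0.30, 0.20])   # share of children per group
rate      = np.array([0.25, 0.18, 0.10, 0.06])   # proportion lacking access

overall = np.sum(pop_share * rate)

# PAR: excess risk in the population relative to the most advantaged group.
par = overall - rate[-1]
par_pct = par / overall

# RII: regress group rates on ridit scores (midpoints of cumulative rank).
cum = np.cumsum(pop_share)
ridit = cum - pop_share / 2
slope, intercept = np.polyfit(ridit, rate, 1, w=pop_share)
rii = (intercept + slope * 0) / (intercept + slope * 1)  # bottom vs top of SES scale

print(f"overall={overall:.3f}  PAR={par:.3f} ({par_pct:.1%})  RII={rii:.2f}")
```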
Abstract:
Prostate cancer (PrCa) is a leading cause of morbidity and mortality, yet its etiology remains uncertain. Meta-analyses show that PrCa risk is reduced by 16% in men with type 2 diabetes (T2D), but the mechanism is unknown. Recent genome-wide association studies and meta-analyses have found single nucleotide polymorphisms (SNPs) that consistently predict T2D risk. We evaluated associations of incident PrCa with 14 T2D SNPs in the Atherosclerosis Risk in Communities (ARIC) study. From 1987 to 2000, there were 397 incident PrCa cases ascertained from state or local cancer registries among 6,642 men (1,560 blacks and 5,082 whites) aged 45-64 years at baseline. Genotypes were determined by TaqMan assay. Cox proportional hazards models were used to assess the association between PrCa and an increasing number of T2D risk-raising alleles, for individual SNPs and for genetic risk scores (GRS) comprising the number of T2D risk-raising alleles across SNPs. Two-way gene-gene interactions were evaluated with likelihood ratio tests. Using additive genetic models, the T2D risk-raising allele was associated with significantly reduced risk of PrCa for IGF2BP2 rs4402960 (hazard ratio [HR]=0.79; P=0.07 among blacks only), SLC2A2 rs5400 (race-adjusted HR=0.85; P=0.05) and UCP2 rs660339 (race-adjusted HR=0.84; P=0.02), but with significantly increased risk of PrCa for CAPN10 rs3792267 (race-adjusted HR=1.20; P=0.05). No other SNPs were associated with PrCa under an additive genetic model. However, at least one copy of the T2D risk-raising allele for TCF7L2 rs7903146 was associated with reduced PrCa risk under a dominant genetic model (race-adjusted HR=0.79; P=0.03). These results imply that the T2D-PrCa association may be partly due to shared genetic variation, but they should be verified since multiple tests were performed. When the combined, additive effects of these SNPs were tested using a GRS, there was nearly a 10% reduction in risk of PrCa per T2D risk-raising allele (race-adjusted HR=0.92; P=0.02). SNPs in IGF2BP2, KCNJ11 and SLC2A2 were also involved in multiple synergistic gene-gene interactions on a multiplicative scale. In conclusion, the T2D-PrCa association appears to be due, in part, to common genetic variation. Further knowledge of T2D gene-PrCa mechanisms may improve understanding of PrCa etiology and may inform PrCa prevention and treatment.
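To make the modeling approach concrete, here is a minimal sketch of an additive GRS fed into a Cox proportional hazards model. The simulated data and column names are hypothetical, and it assumes the lifelines library (the abstract does not say what software was used):

```python
# Minimal sketch (hypothetical data, not ARIC): builds an additive genetic
# risk score (GRS) as the count of T2D risk-raising alleles across SNPs and
# fits a Cox proportional hazards model for incident PrCa.
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter

rng = np.random.default_rng(0)
n, n_snps = 500, 14
# Each SNP coded 0/1/2 = number of T2D risk-raising alleles (additive model).
snps = pd.DataFrame(rng.integers(0, 3, size=(n, n_snps)),
                    columns=[f"snp{i}" for i in range(n_snps)])
df = pd.DataFrame({
    "grs": snps.sum(axis=1),         # genetic risk score
    "black": rng.integers(0, 2, n),  # race indicator (for race-adjusted HR)
    "time": rng.exponential(10, n),  # follow-up time (years)
    "event": rng.integers(0, 2, n),  # incident PrCa indicator
})

cph = CoxPHFitter()
cph.fit(df, duration_col="time", event_col="event")
cph.print_summary()  # exp(coef) for "grs" is the per-allele hazard ratio
```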
Abstract:
Background. In the United States, the incidence of pancreatic cancer has increased; more than 37,000 new cases were diagnosed in 2007. The overall five-year survival rate is about 5%, and pancreatic cancer ranks as the fourth leading cause of cancer-related mortality among men and women. Despite progress in cancer diagnosis and treatment, pancreatic cancer remains an unresolved, significant public health problem in the United States. Familial pancreatic cancer has been confirmed to be responsible for approximately 10% of pancreatic cancer cases; the remaining 90% are without known inherited predisposition. Until now, the role of oral contraceptive pills (OCPs) and hormonal replacement therapy (HRT) among women with pancreatic cancer has remained unclear. We examined the association of exogenous hormone use with risk of pancreatic cancer in US women. Methods. This was an active hospital-based case-control study conducted at the Department of Gastrointestinal Medical Oncology at The University of Texas M.D. Anderson Cancer Center. Between January 2005 and December 2007, a total of 287 women with pathologically confirmed pancreatic cancer (cases) and 287 healthy women (controls) were included in this investigation. Cases and controls were frequency matched by age and race. Information about the use of hormonal contraceptives and HRT preparations, as well as about several risk factors for pancreatic cancer, was collected by personal interview. Univariate and multivariate analyses were performed to analyze the data. Results. We found a statistically significant protective effect of exogenous hormone preparations on pancreatic cancer development (adjusted odds ratio [AOR], 0.4; 95% confidence interval [CI], 0.2–0.8). In addition, a 40% reduction in pancreatic cancer risk was observed among women who had ever used any contraceptive method, including oral contraceptive pills (AOR, 0.6; 95% CI, 0.4–0.9). Conclusions. Consistent with previous studies, the use of exogenous hormone preparations, including oral contraceptive pills, may confer a protective effect against pancreatic cancer development. More studies are warranted to explore the underlying mechanism of such protection.
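For readers unfamiliar with how such an odds ratio is obtained, here is a minimal sketch using a 2x2 exposure table. The cell counts are invented (only the 287/287 group sizes come from the abstract), chosen so the crude OR lands near 0.6 and shows how that corresponds to the reported ~40% risk reduction:

```python
# Minimal sketch (illustrative counts, not the study's data): crude odds
# ratio and 95% CI from a 2x2 table of exposure (ever used OCPs) by
# case-control status.
import math

a, b = 95, 192   # cases: exposed, unexposed (95 + 192 = 287)
c, d = 130, 157  # controls: exposed, unexposed (130 + 157 = 287)

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)

print(f"OR={odds_ratio:.2f} (95% CI {lo:.2f}-{hi:.2f}); "
      f"reduction={1 - odds_ratio:.0%}")
```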
Abstract:
Objective. Congenital limb reduction defects (CLRDs) are common birth defects, occurring in approximately 2-7 per 10,000 live births. Because congenital limb defects are pervasive throughout all populations, and the conditions profoundly affect quality of life, they represent a significant public health concern. Currently there is a paucity of etiologic information in the literature regarding CLRDs, which represents a major limitation in developing treatment strategies and in identifying high-risk pregnancies. Additionally, although the majority of CLRDs are isolated, most previous studies have not separated them from those occurring as part of a known syndrome or with multiple additional congenital anomalies of unknown etiology. It stands to reason that factors responsible for multiple congenital anomalies that happen to include limb reduction defects may be quite different from those leading to an isolated defect. As a first step toward etiologic understanding, this cross-sectional study was undertaken to determine the birth prevalence of, and obtain demographic information about, non-syndromic (isolated) CLRDs that occurred in Texas from 1999-2001. Methods. The study population included all infants/fetuses with isolated CLRDs born in Texas during 1999-2001; the comparison population was all infants born to mothers who were Texas residents during the same period. The overall birth prevalence of limb reduction defects was determined and adjusted for ethnicity, gender, site of defect (upper limb versus lower limb), county of residence, maternal age, and maternal education. Results. In Texas, the overall birth prevalence of isolated CLRDs was 2.1/10,000 live births (1.5 and 0.6/10,000 live births for upper limb and lower limb, respectively). The risk of isolated lower limb CLRDs in Texas was significantly lower in females, both when gender was examined individually (crude prevalence odds ratio of 0.57, 95% CI 0.36-0.91) and in relation to all other variables used in the analysis (adjusted prevalence odds ratio of 0.58, 95% CI 0.36-0.93). Harris County (which includes the Houston metropolitan area) had significantly lower risks of all (upper limb and lower limb combined) isolated CLRDs when examined in relation to other counties in Texas, with a crude prevalence odds ratio of 0.4 (95% CI: 0.29-0.72) and an adjusted prevalence odds ratio of 0.50 (95% CI: 0.31-0.80). The risk of isolated upper limb CLRDs was significantly lower in Harris County (crude prevalence odds ratio of 0.45, CI 0.26-0.76; adjusted prevalence odds ratio of 0.49, CI 0.28-0.84). This trend toward decreased risk in Harris County was not observed for isolated lower limb reduction defects (adjusted prevalence odds ratio of 0.50, 95% confidence interval: 0.22-1.12). Conclusions.
The birth prevalence of isolated congenital limb reduction defects in Texas is at the lower end of the range of rates reported by other authors for other states (Alabama, Arkansas, California, Georgia, Hawaii, Iowa, Maryland, Massachusetts, North Carolina, Oklahoma, Utah, Washington) and other countries (Argentina, Australia, Austria, Bolivia, Brazil, Canada, Chile, China, Colombia, Costa Rica, Croatia, Denmark, Ecuador, England, Finland, France, Germany, Hungary, Ireland, Israel, Italy, Lithuania, Mexico, Norway, Paraguay, Peru, Spain, Scotland, Sweden, Switzerland, Uruguay, and Venezuela). In Texas, the birth prevalence of isolated congenital lower limb reduction defects was greater for males than for females, while the birth prevalence of isolated congenital upper limb reduction defects did not differ significantly between males and females. The reduced rates of limb reduction defects in Harris County warrant further investigation. This study provides an important first step toward etiologic understanding of isolated congenital limb reduction defects.
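The two headline statistics here, a birth prevalence per 10,000 live births and a crude prevalence odds ratio, reduce to simple ratios. A minimal sketch with invented counts, chosen only so the outputs land near the reported 2.1/10,000 and 0.57 (these are not the Texas registry numbers):

```python
# Minimal sketch (illustrative counts, not registry data): birth prevalence
# per 10,000 live births and a crude prevalence odds ratio comparing
# females to males for lower-limb CLRDs.
cases_total = 230          # hypothetical isolated CLRD cases, 1999-2001
live_births = 1_080_000    # hypothetical Texas live births, 1999-2001

prevalence = cases_total / live_births * 10_000
print(f"birth prevalence = {prevalence:.1f} per 10,000 live births")

# Crude prevalence odds ratio (female vs male), lower-limb defects only.
f_cases, f_births = 25, 528_000
m_cases, m_births = 44, 552_000
odds_f = f_cases / (f_births - f_cases)
odds_m = m_cases / (m_births - m_cases)
print(f"prevalence OR (F vs M) = {odds_f / odds_m:.2f}")
```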
Abstract:
The relationship between degree of diastolic blood pressure (DBP) reduction and mortality was examined among hypertensives, ages 30-69, in the Hypertension Detection and Follow-up Program (HDFP). The HDFP was a multi-center, community-based trial that followed 10,940 hypertensive participants for five years. One-year survival was required for inclusion in this investigation, since the one-year annual visit was the first occasion at which change in blood pressure could be measured on all participants. During the subsequent four years of follow-up on 10,052 participants, 568 deaths occurred. For levels of change in DBP and for categories of variables related to mortality, the crude mortality rate was calculated. Time-dependent life tables were also calculated so as to utilize the available blood pressure data over time. In addition, the Cox life table regression model, extended to take into account both time-constant and time-dependent covariates, was used to examine the relationship between change in blood pressure over time and mortality. The results of the time-dependent life table and time-dependent Cox life table regression analyses supported a quadratic function modeling the relationship between DBP reduction and mortality, even after adjusting for other risk factors. The minimum mortality hazard ratio, based on a particular model, occurred at a DBP reduction of 22.6 mm Hg (standard error = 10.6) in the whole population and 8.5 mm Hg (standard error = 4.6) in the baseline DBP stratum 90-104. Beyond this reduction, there was a small increase in the risk of death. There was no evidence of a quadratic function when the same model was fitted using systolic blood pressure. Methodologic issues involved in studying a particular degree of blood pressure reduction were considered. The confidence interval around the change corresponding to the minimum hazard ratio was wide, and the obtained blood pressure level should not be interpreted as a goal for treatment. Blood pressure reduction was attributed not only to pharmacologic therapy, but also to regression to the mean and to other unknown factors unrelated to treatment. Therefore, the surprising results of this study do not provide direct implications for treatment, but strongly suggest replication in other populations.
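Where that minimum comes from: if the adjusted log hazard contains linear and quadratic terms in DBP reduction x, it is minimized at x* = -b1/(2*b2), and a standard error follows from the delta method applied to the coefficient covariance. A minimal sketch with made-up coefficients, chosen so x* lands near the reported 22.6 mm Hg (these are not the HDFP estimates):

```python
# Minimal sketch (assumed coefficients, not the HDFP model): for a Cox
# model whose log hazard includes b1*x + b2*x**2 in DBP reduction x, the
# reduction giving minimum hazard is x* = -b1/(2*b2).
import numpy as np

b1, b2 = -0.0271, 0.0006            # hypothetical Cox coefficients
cov = np.array([[1.2e-4, -2.0e-6],  # hypothetical covariance of (b1, b2)
                [-2.0e-6, 6.0e-8]])

x_star = -b1 / (2 * b2)

# Delta method: gradient of x* with respect to (b1, b2).
grad = np.array([-1 / (2 * b2), b1 / (2 * b2**2)])
se = np.sqrt(grad @ cov @ grad)
print(f"minimum-hazard DBP reduction = {x_star:.1f} mm Hg (SE {se:.1f})")
```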
Abstract:
Preventable hospitalizations (PHs) are hospitalizations that can be avoided with appropriate and timely care in the ambulatory setting, and hence are closely associated with primary care access in a community. Increased primary care availability and health insurance coverage may increase primary care access and consequently may be significantly associated with the risks and costs of PHs. Objective. To estimate the risk and cost of PHs; to determine the association of primary care availability and health insurance coverage with the risk and costs of PHs, first alone and then simultaneously; and finally, to estimate the impact of expansions in primary care availability and health insurance coverage on the burden of PHs among non-elderly adult residents of Harris County. Methods. The study population was residents of Harris County, age 18 to 64, who had at least one hospital discharge in a Texas hospital in 2008. The primary independent variables were availability of primary care physicians, availability of primary care safety net clinics, and health insurance coverage. The primary dependent variables were PHs and associated hospitalization costs. The Texas Health Care Information Collection (THCIC) Inpatient Discharge data were used to obtain information on the number and costs of PHs in the study population. The risk of PHs in the study population, as well as the average and total costs of PHs, were calculated. Multivariable logistic regression models and two-step Heckman regression models with log-transformed costs were used to determine the association of primary care availability and health insurance coverage with the risk and costs of PHs, respectively, while controlling for individual predisposing, enabling, and need characteristics. Predicted PH risk and cost were used to calculate the predicted burden of PHs in the study population and the impact of expansions in primary care availability and health insurance coverage on that burden. Results. In 2008, hospitalized non-elderly adults in Harris County had 11,313 PHs and a corresponding PH risk of 8.02%. Congestive heart failure was the most common PH. PHs imposed a total economic burden of $84 million, at an average of $7,449 per PH. Higher primary care safety net availability was significantly associated with lower risk of PHs in the final risk model, but only among the uninsured: a unit increase in safety net availability led to a 23% decline in PH odds in the uninsured, compared to only a 4% decline in the insured. Higher primary care physician availability was associated with increased PH costs in the final cost model (β=0.0020; p<0.05). Lack of health insurance coverage increased the risk of PH, with the uninsured having 30% higher odds of PHs (OR=1.299; p<0.05), but reduced the cost of a PH by 7% (β=-0.0668; p<0.05). Expansions in primary care availability and health insurance coverage were associated with a reduction of about $1.6 million in PH burden at the highest level of expansion. Conclusions. The availability of primary care resources and health insurance coverage among hospitalized non-elderly adults in Harris County are significantly associated with the risk and costs of PHs. Expansions in these primary care access factors can be expected to produce significant reductions in the burden of PHs in Harris County.
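A minimal sketch of the two-step (Heckman-style) cost model described above, on simulated data with invented variable names; the probit stage models whether a discharge is a PH, and its inverse Mills ratio enters the log-cost regression as a selection correction. This assumes statsmodels, not whatever software the study used:

```python
# Minimal sketch (simulated data, not THCIC records): two-step Heckman-style
# model -- a probit for whether a hospitalization is a PH, then OLS on log
# cost with the inverse Mills ratio as a selection correction.
import numpy as np
import statsmodels.api as sm
from scipy.stats import norm

rng = np.random.default_rng(1)
n = 2000
uninsured = rng.integers(0, 2, n)
safety_net = rng.normal(0, 1, n)          # clinics per capita (standardized)
X = sm.add_constant(np.column_stack([uninsured, safety_net]))

# Simulated selection (PH vs not) and log cost.
is_ph = (X @ np.array([-1.0, 0.26, -0.2]) + rng.normal(0, 1, n)) > 0
log_cost = 8.9 - 0.07 * uninsured + rng.normal(0, 0.5, n)

# Step 1: probit for PH risk.
probit = sm.Probit(is_ph.astype(int), X).fit(disp=0)
xb = X @ probit.params                    # linear predictor
mills = norm.pdf(xb) / norm.cdf(xb)       # inverse Mills ratio

# Step 2: OLS on log cost for PHs only, with the Mills ratio included.
sel = is_ph
X2 = sm.add_constant(np.column_stack([uninsured[sel], mills[sel]]))
ols = sm.OLS(log_cost[sel], X2).fit()
print(ols.params)  # coefficient on 'uninsured' ~ proportional cost difference
```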
Abstract:
OBJECTIVE. To determine the effectiveness of active surveillance cultures and associated infection control practices on the incidence of methicillin-resistant Staphylococcus aureus (MRSA) in the acute care setting. DESIGN. A historical analysis of existing clinical data utilizing an interrupted time series design. SETTING AND PARTICIPANTS. Patients admitted to a 260-bed tertiary care facility in Houston, TX, from January 2005 through December 2010. INTERVENTION. Infection control practices, including enhanced barrier precautions, compulsive hand hygiene, disinfection and environmental cleaning, and executive ownership and education, were simultaneously introduced during a 5-month implementation period culminating in active surveillance screening. Beginning June 2007, all high-risk patients were cultured for MRSA nasal carriage within 48 hours of admission. Segmented Poisson regression was used to test the significance of the difference in incidence of healthcare-associated MRSA between the 29-month pre-intervention period and the 43-month post-intervention period. RESULTS. A total of 9,957 of 11,095 high-risk patients (89.7%) were screened for MRSA carriage during the intervention period. Active surveillance cultures identified 1,330 MRSA-positive patients (13.4%), contributing to an admission prevalence of 17.5% in high-risk patients. The mean rate of healthcare-associated MRSA infection and colonization decreased from 1.1 per 1,000 patient-days in the pre-intervention period to 0.36 per 1,000 patient-days in the post-intervention period (P<0.001). The intervention and the percentage of S. aureus isolates susceptible to oxacillin were both statistically significantly associated with the incidence of MRSA infection and colonization (IRR = 0.50, 95% CI = 0.31-0.80 and IRR = 0.004, 95% CI = 0.00003-0.40, respectively). CONCLUSIONS. Aggressively targeting patients at high risk for MRSA colonization with active surveillance cultures and associated infection control practices, as part of a multifaceted, hospital-wide intervention, is effective in reducing the incidence of healthcare-associated MRSA.
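A minimal sketch of a segmented Poisson model for such an interrupted time series, on simulated monthly data. Only the 29/43-month split and the 1.1 and 0.36 per 1,000 patient-day rates come from the abstract; the counts, patient-day denominators, and use of statsmodels are assumptions:

```python
# Minimal sketch (simulated counts, not the hospital's data): segmented
# Poisson regression -- baseline trend plus a post-intervention level and
# trend change, with log(patient-days) as the exposure offset.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(2)
months = np.arange(72)                    # 29 pre + 43 post intervention months
post = (months >= 29).astype(int)
patient_days = rng.normal(6500, 300, 72).round()

# Simulated MRSA counts: rate drops from ~1.1 to ~0.36 per 1,000 patient-days.
rate = np.where(post, 0.36, 1.1) / 1000
counts = rng.poisson(rate * patient_days)

X = sm.add_constant(np.column_stack([months, post, post * (months - 29)]))
model = sm.GLM(counts, X, family=sm.families.Poisson(),
               offset=np.log(patient_days)).fit()
print(np.exp(model.params[2]))  # IRR for the post-intervention level change
```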
Abstract:
1. With the global increase in CO2 emissions, there is a pressing need for studies aimed at understanding the effects of ocean acidification on marine ecosystems. Several studies have reported that exposure to CO2 impairs chemosensory responses of juvenile coral reef fishes to predators. Moreover, one recent study pointed to impaired responses of reef fish to auditory cues that indicate risky locations. These studies suggest that altered behaviour following exposure to elevated CO2 is caused by a systemic effect at the neural level. 2. The goal of our experiment was to test whether juvenile damselfish Pomacentrus amboinensis exposed to different levels of CO2 would respond differently to a potential threat: the sight of a large novel coral reef fish, a spiny chromis, Acanthochromis polyacanthus, placed in a watertight bag. 3. Juvenile damselfish exposed to 440 (current-day control), 550 or 700 µatm CO2 did not differ in their response to the chromis. However, fish exposed to 850 µatm showed reduced antipredator responses; they failed to show the same reduction in foraging, activity and area use in response to the chromis. Moreover, they moved closer to the chromis and lacked the bobbing behaviour typically displayed by juvenile damselfishes in threatening situations. 4. Our results are the first to suggest that responses to visual cues of risk may be impaired by CO2, and they provide strong evidence that the multi-sensory effects of CO2 may stem from systemic effects at the neural level.
Abstract:
Probabilistic climate data have become available for the first time through the UK Climate Projections 2009, so that the risk of change in tree growth can be quantified. We assess drought risk spatially and temporally using drought probabilities and tree species vulnerabilities across Britain. We assessed the drought impact on the potential yield class of three major tree species (Picea sitchensis, Pinus sylvestris, and Quercus robur), which presently cover around 59% (400,700 ha) of state-managed forests, across lowland and upland sites. Here we show that drought impacts mostly result in reduced tree growth over the next 80 years under the B1, A1B and A1FI IPCC emissions scenarios. We found a maximum reduction of 94% but also a maximum increase of 56% in potential stand yield class in the 2080s relative to the baseline climate (1961-1990). Furthermore, potential production over the national forest estate for all three species in the 2080s may decrease due to drought by 42% in the lowlands and 32% in the uplands in comparison with the baseline climate. Our results reveal that potential tree growth and forest production on the national forest estate in Britain are likely to decline, and they indicate where and when adaptation measures are required. Moreover, this paper demonstrates the value of probabilistic climate projections for an important economic and environmental sector.
Abstract:
Coastal communities around the world face increasing risk from flooding as a result of rising sea level, increasing storminess, and land subsidence. Salt marshes can act as natural buffer zones, providing protection from waves during storms. However, the effectiveness of marshes in protecting the coastline during extreme events, when water levels and waves are highest, is poorly understood. Here, we experimentally assess wave dissipation under storm surge conditions in a 300-m-long wave flume that contains a transplanted section of natural salt marsh. We find that the presence of marsh vegetation causes considerable wave attenuation, even when water levels and waves are high. From a comparison with experiments without vegetation, we estimate that up to 60% of the observed wave reduction is attributable to vegetation. We also find that although waves progressively flatten and break vegetation stems, thereby reducing dissipation, the marsh substrate remained remarkably stable and resistant to surface erosion under all conditions. The effectiveness of storm wave dissipation and the resilience of tidal marshes even under extreme conditions suggest that salt marsh ecosystems can be a valuable component of coastal protection schemes.
Abstract:
Drought spells can impose severe impacts on the most vulnerable farms. It is well known that uninsured exposure exacerbates income inequality in farming systems. However, the high administrative costs of traditional insurance hinder small farmers' access to risk management tools, and the existence of moral hazard and systemic risk prevents the implementation of traditional insurance programs to address drought risk in rural areas. Innovative technologies such as satellite imagery are being used to derive vegetation indices that are highly correlated with drought impacts, and applying this technology in agricultural insurance may help to overcome some of the limitations of traditional insurance. However, basis risk has been identified as one of the main problems hindering the acceptance of index insurance. In this paper we focus on the analysis of basis risk under different contract options in the grazing lands of the Araucanía region. A vegetation index database is used to develop an actuarial insurance model and estimate risk premiums for moderate and severe drought coverage. The risk premium increases sharply with risk coverage. In contrast with previous findings in the literature, our results are not conclusive and show that lowering the coverage level does not necessarily imply a reduction in basis risk. Further analysis of the relationship between contract design and basis risk is a promising area of research that may yield important social utility for the most vulnerable farming systems.
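To see why the premium rises with coverage, consider a typical index-insurance payout structure: the insurer pays once the vegetation index falls below a strike, scaling linearly up to the full sum insured at an exit value, and the actuarially fair premium is the expected payout. A minimal sketch with invented index values and thresholds (not the Araucanía data or the paper's actual contract terms):

```python
# Minimal sketch (hypothetical index distribution and thresholds): the
# actuarially fair premium of an index-insurance contract is the expected
# payout under the index distribution.
import numpy as np

rng = np.random.default_rng(3)
ndvi = rng.normal(0.55, 0.08, 10_000)   # simulated seasonal vegetation index

def fair_premium(strike, exit_, sum_insured):
    # Payout scales linearly from 0 at the strike to full sum insured at exit.
    shortfall = np.clip((strike - ndvi) / (strike - exit_), 0, 1)
    return np.mean(shortfall) * sum_insured

# Moderate-drought coverage uses a higher strike, so it pays out more often
# and carries a higher premium than severe-drought coverage.
print(fair_premium(strike=0.50, exit_=0.35, sum_insured=1000))  # moderate
print(fair_premium(strike=0.42, exit_=0.35, sum_insured=1000))  # severe
```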
Abstract:
The critical conditions for hydrogen embrittlement (HE) risk of high-strength galvanized steel (HSGS) wires and tendons exposed to alkaline concrete pore solutions have been evaluated by means of electrochemical and mechanical testing. There is a relationship between the hydrogen embrittlement risk in HSGS and the duration of the hydrogen evolution process in alkaline media. The galvanized steel undergoes anodic dissolution simultaneously with hydrogen evolution, which does not stop until the passivation process is complete. HSGS wires exposed to highly alkaline media showed HE risk, with loss of mechanical properties, only when long periods of hydrogen evolution took place together with simultaneous, intensive reduction of the galvanized coating.
Abstract:
The road to the automation of agricultural processes passes through the safe operation of autonomous vehicles. This requirement is already a fact for ground mobile units, but it is still not well defined for aerial robots (UAVs), mainly because the applicable norms and legislation are quite diffuse or even nonexistent. Defining a common, global policy is therefore the challenge to tackle, and this characterization has to be addressed from field experience. Accordingly, this paper presents work done in this direction, based on an analysis of the most common sources of hazards when using UAVs for agricultural tasks. The work, based on the ISO 31000 standard, has been carried out by applying a three-step structure that integrates identification, assessment, and reduction procedures. The paper describes how this method has been applied to analyze previous accidents and malfunctions during UAV operations in order to determine real failure causes. This has made it possible to highlight common risks and hazard sources and to propose specific guards and safety measures for the agricultural context.
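As a concrete illustration of that three-step structure, here is a minimal sketch of a risk register in the ISO 31000 spirit. The hazards, scores, and tolerance threshold are invented for illustration and are not taken from the paper's field data:

```python
# Minimal sketch (illustrative entries only): identify hazards, assess them
# with a likelihood x severity score, and flag those needing reduction
# measures -- the three steps named in the text.
from dataclasses import dataclass

@dataclass
class Hazard:
    name: str
    likelihood: int  # 1 (rare) .. 5 (frequent)
    severity: int    # 1 (negligible) .. 5 (catastrophic)

    @property
    def risk(self) -> int:
        return self.likelihood * self.severity

# Step 1: identification (hypothetical hazards for an agricultural UAV).
hazards = [
    Hazard("GPS signal loss over crop canopy", 3, 4),
    Hazard("battery depletion mid-flight", 2, 5),
    Hazard("propeller strike during manual landing", 2, 3),
]

# Steps 2-3: assessment, then flag hazards above a tolerance threshold
# as requiring specific guards or safety measures.
TOLERANCE = 8
for h in sorted(hazards, key=lambda h: h.risk, reverse=True):
    action = "REDUCE" if h.risk > TOLERANCE else "accept/monitor"
    print(f"{h.risk:>2}  {h.name}: {action}")
```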
Abstract:
Society is frequently exposed to and threatened by dangerous phenomena in many parts of the world. Different types of phenomena require specific actions for proper risk management, from hazard identification through mitigation (including monitoring and early warning) and/or reduction. Understanding both the predisposing factors and the triggering mechanisms of a given danger, and predicting its evolution from the source across the overall affected zone, are relevant issues that must be addressed to properly evaluate a given hazard.
Abstract:
Poverty increases children's exposure to stress, elevating their risk for developing patterns of heightened sympathetic and parasympathetic stress reactivity. Repeated patterns of high sympathetic activation and parasympathetic withdrawal place children at risk for anxiety disorders. This study evaluated whether providing social support to preschool-age children during mildly stressful situations helps reduce reactivity, and whether this effect depends in part on children's previously assessed baseline reactivity patterns. The Biological Sensitivity to Context (BSC) theory proposes that highly reactive children may be more sensitive than less reactive children to all environmental influences, including social support. In contrast, conventional physiological reactivity (CPR) theory contends that highly reactive children are more vulnerable to the impact of stress but less receptive to the potential benefits present within their social environments. In this study, baseline autonomic reactivity patterns were measured; children were then randomly assigned to a high-support or neutral control condition, and the effect of social support on autonomic response patterns was assessed. Results revealed an interaction between baseline reactivity profiles and experimental condition: children with patterns of high reactivity benefited more from social support than did their less reactive peers, showing relatively less reactivity reduction in the neutral condition and relatively greater reactivity reduction in the support condition. Despite their demonstrated stability over time, reactivity patterns are also quite susceptible to change at this age; therefore, understanding how social support ameliorates reactivity will further efforts to avert stable patterns of high reactivity among children with high levels of stress, ultimately reducing their risk for anxiety disorders.