865 results for "Risk of flood"
Abstract:
This essay examines the possibilities for practices that appeal to the primitive in the contemporary cultural context. The idea of the primitive is driven by a desire to challenge the limitations of Western culture, while at the same time attracting the charge of promoting Eurocentrism. This essay investigates this double risk and how artists have sought to evade it, confound it, or accentuate it.
Abstract:
BACKGROUND Mosquito-borne diseases are climate sensitive, and there has been increasing concern over the impact of climate change on future disease risk. This paper projected the potential future risk of Barmah Forest virus (BFV) disease under climate change scenarios in Queensland, Australia. METHODS/PRINCIPAL FINDINGS We obtained data on notified BFV cases, climate (maximum and minimum temperature and rainfall), socio-economic and tidal conditions for the period 2000-2008 for coastal regions in Queensland. Gridded data on future climate projections for 2025, 2050 and 2100 were also obtained. Logistic regression models were built to forecast the potential risk of BFV disease distribution under existing climatic, socio-economic and tidal conditions. The model was applied to estimate the potential geographic distribution of BFV outbreaks under climate change scenarios. The predictive model had good accuracy, sensitivity and specificity. Maps of potential future BFV disease risk indicated that risk would vary significantly across coastal regions in Queensland by 2100 due to marked differences in future rainfall and temperature projections. CONCLUSIONS/SIGNIFICANCE The results of this study demonstrate that the future risk of BFV disease would vary across coastal regions in Queensland. These results may be helpful for public health decision making and for developing effective risk management strategies for BFV disease control and prevention programs in Queensland.
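As a rough illustration of the modelling approach this abstract describes, the sketch below fits a logistic regression of outbreak occurrence on climatic, socio-economic and tidal covariates and then applies the fitted model to an altered climate scenario. All variable names, data and coefficients are placeholders invented for the example, not the study's dataset or fitted model.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 500
current = pd.DataFrame({
    "tmax": rng.normal(29, 2, n),       # max temperature, deg C (placeholder)
    "tmin": rng.normal(19, 2, n),       # min temperature, deg C
    "rain": rng.gamma(2.0, 40.0, n),    # rainfall, mm/month
    "seifa": rng.normal(1000, 80, n),   # socio-economic index
    "tide": rng.normal(1.2, 0.3, n),    # tidal height, m
})
# synthetic outbreak indicator, generated only so the example runs end to end
lp = -10.5 + 0.25 * current["tmax"] + 0.004 * current["rain"] + 1.0 * current["tide"]
current["outbreak"] = rng.binomial(1, 1 / (1 + np.exp(-lp)))

X = sm.add_constant(current[["tmax", "tmin", "rain", "seifa", "tide"]])
fit = sm.Logit(current["outbreak"], X).fit(disp=False)

# apply the fitted model to a projected-climate scenario (e.g. 2100)
future = current.copy()
future["tmax"] += 3.0   # assumed warming
future["rain"] *= 0.9   # assumed drying
X_future = sm.add_constant(future[["tmax", "tmin", "rain", "seifa", "tide"]])
print(pd.Series(fit.predict(X_future)).describe())  # distribution of projected risk
```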
Abstract:
Purpose To evaluate the association between retinal nerve fibre layer (RNFL) thickness and diabetic peripheral neuropathy in people with type 2 diabetes, and specifically those at higher risk of foot ulceration. Methods RNFL thickness was measured globally and in four quadrants (temporal, superior, nasal and inferior) at 3.45 mm diameter around the optic nerve head using optical coherence tomography (OCT). Severity of neuropathy was assessed using the Neuropathy Disability Score (NDS). Eighty-two participants with type 2 diabetes were stratified according to NDS scores (0-10) as: none, mild, moderate, and severe neuropathy. A control group was additionally included (n=17). Individuals with NDS ≥ 6 (moderate and severe neuropathy) have been shown to be at higher risk of foot ulceration. A linear regression model was used to determine the association between RNFL and severity of neuropathy. Age, disease duration and diabetic retinopathy levels were fitted in the models. An independent t-test was employed for comparison between controls and the group without neuropathy, as well as for comparison between groups with higher and lower risk of foot ulceration. Analysis of variance was used to compare across all NDS groups. Results RNFL thickness was significantly associated with NDS in the inferior quadrant (b = -1.46, p=0.03). RNFL thicknesses globally and in the superior, temporal and nasal quadrants did not show significant associations with NDS (all p>0.51). These findings were independent of the effect of age, disease duration and retinopathy. RNFL was thinner for the group with NDS ≥ 6 in all quadrants but significantly so only inferiorly (p<0.005). RNFL for control participants was not significantly different from the group with diabetes and no neuropathy (superior p=0.07; global and all other quadrants p>0.23). Mean RNFL thickness was not significantly different between the four NDS groups globally and in all quadrants (p=0.08 for inferior, p>0.14 for all other comparisons). Conclusions Retinal nerve fibre layer thinning is associated with neuropathy in people with type 2 diabetes. This relationship is strongest in the inferior retina and in individuals at higher risk of foot ulceration.
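A minimal sketch of the adjusted linear regression described here, on simulated placeholder data; the variable names and effect sizes are assumptions for illustration, not the study's measurements.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 82
df = pd.DataFrame({
    "nds": rng.integers(0, 11, n),          # Neuropathy Disability Score, 0-10
    "age": rng.normal(62, 9, n),            # years
    "duration": rng.normal(12, 6, n),       # years since diabetes diagnosis
    "retinopathy": rng.integers(0, 4, n),   # graded retinopathy level
})
# simulated inferior-quadrant RNFL thickness with a mild negative NDS effect
df["rnfl_inferior"] = 130 - 1.46 * df["nds"] - 0.2 * df["age"] + rng.normal(0, 8, n)

# RNFL regressed on NDS, adjusting for age, disease duration and retinopathy
fit = smf.ols("rnfl_inferior ~ nds + age + duration + retinopathy", data=df).fit()
print(fit.params["nds"], fit.pvalues["nds"])  # adjusted slope for NDS and its p-value
```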
Abstract:
Introduction Environmental and biological samples taken around Da Nang Air Base have shown elevated levels of dioxin over many years [1-3]. A pre-intervention knowledge, attitudes and practices (KAP) survey (2009), a risk reduction program (2010) and a post-intervention KAP survey (2011) were undertaken in four wards surrounding Da Nang Air Base. A follow-up evaluation was undertaken in 2013. Methods A KAP survey was implemented among 400 randomly selected food handlers. Eleven in-depth interviews and four focus group discussions were also undertaken. Results The knowledge of respondents remained positive and/or improved at 2.5 years of follow-up. There were no significant differences in attitudes toward preventing dioxin exposure across surveys; most respondents were positive in all three surveys. The proportion of households undertaking measures to prevent exposure (69.5%) was higher than in the pre-intervention survey (39.6%) and the post-intervention survey (60.4%) (χ2 = 95.6; p < 0.001). The proportion of respondents practicing appropriate preventive measures was also significantly improved. Conclusions Despite most of the intervention program’s activities ceasing in 2010, the risk reduction program has resulted in positive outcomes over the longer term, with many knowledge and attitude measures remaining stable or improving. Some KAP indicators decreased, but these indicators were still significantly higher than the pre-intervention levels.
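The chi-square comparison of household prevention measures across the three surveys can be sketched as below. The counts assume 400 respondents per survey, which is an illustrative assumption; the paper's exact denominators may differ, so the statistic will not reproduce the reported χ2 = 95.6 exactly.

```python
import numpy as np
from scipy.stats import chi2_contingency

n = 400  # assumed respondents per survey (illustrative)
# proportions taking preventive measures: pre, post, follow-up
taking = np.round(np.array([0.396, 0.604, 0.695]) * n).astype(int)
not_taking = n - taking
table = np.column_stack([taking, not_taking])  # rows: surveys, cols: yes/no

chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.2g}")
```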
Abstract:
OBJECTIVES: Bottle-feeding has been suggested to increase the risk of pyloric stenosis (PS). However, large population-based studies are needed. We examined the effect of bottle-feeding during the first 4 months after birth, using detailed data about the timing of first exposure to bottle-feeding and extensive confounder information. METHODS: We performed a large population-based cohort study based on the Danish National Birth Cohort, which provided information on infants and feeding practice. Information about surgery for PS was obtained from the Danish National Patient Register. The association between bottle-feeding and the risk of PS was evaluated by hazard ratios (HRs) estimated in a Cox regression model, adjusting for possible confounders. RESULTS: Among 70 148 singleton infants, 65 infants had surgery for PS, of whom 29 were bottle-fed before PS diagnosis. The overall HR of PS for bottle-fed infants compared with infants who were not bottle-fed was 4.62 (95% confidence interval [CI]: 2.78–7.65). Among bottle-fed infants, risk increases were similar for infants who were both breastfed and bottle-fed (HR: 3.36 [95% CI: 1.60–7.03]), formerly breastfed (HR: 5.38 [95% CI: 2.88–10.06]), and never breastfed (HR: 6.32 [95% CI: 2.45–16.26]) (P = .76). The increased risk of PS among bottle-fed infants persisted beyond 30 days after first exposure to bottle-feeding and did not vary with age at first exposure. CONCLUSIONS: Bottle-fed infants experienced a 4.6-fold higher risk of PS compared with infants who were not bottle-fed. The result adds to the evidence supporting the advantage of exclusive breastfeeding in the first months after birth.
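A minimal sketch of a Cox model with bottle-feeding entered as a time-varying exposure, in the spirit of the analysis described above, using the lifelines library. The simulated cohort, hazard values and the 4.6-fold effect size are illustrative assumptions, not the Danish National Birth Cohort data or the study's fitted model.

```python
import numpy as np
import pandas as pd
from lifelines import CoxTimeVaryingFitter

rng = np.random.default_rng(2)
base_rate, hr = 0.0005, 4.6   # assumed baseline hazard per day and assumed effect size
rows = []
for i in range(2000):
    bottle_day = float(rng.integers(5, 120)) if rng.random() < 0.5 else None
    # piecewise-exponential event time: baseline hazard before the first bottle,
    # elevated hazard afterwards
    t = rng.exponential(1 / base_rate)
    if bottle_day is not None and t >= bottle_day:
        t = bottle_day + rng.exponential(1 / (base_rate * hr))
    end = min(t, 120.0)                 # administrative censoring at ~4 months
    event = int(t <= 120.0)
    if bottle_day is None or bottle_day >= end:
        rows.append((i, 0.0, end, 0, event))          # never exposed during follow-up
    else:
        rows.append((i, 0.0, bottle_day, 0, 0))       # unexposed interval
        rows.append((i, bottle_day, end, 1, event))   # exposed interval
df = pd.DataFrame(rows, columns=["id", "start", "stop", "bottle", "event"])

ctv = CoxTimeVaryingFitter().fit(df, id_col="id", event_col="event",
                                 start_col="start", stop_col="stop")
print(np.exp(ctv.params_["bottle"]))  # estimated hazard ratio for bottle-feeding
```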
Abstract:
Suspected nephrocarcinogenic effects of trichloroethene (TRI) in humans are attributed to metabolites derived from the glutathione transferase (GST) pathway. The influence of polymorphisms of the GSTM1 and GSTT1 isoenzymes on the risk of renal cell cancer in subjects who had been exposed to high levels of TRI over many years was investigated. GSTM1 and GSTT1 genotypes were determined by internal-standard-controlled polymerase chain reaction. Forty-five cases with histologically verified renal cell cancer and a history of long-term occupational exposure to high concentrations of TRI were studied. A reference group consisted of 48 workers from the same geographical region with similar histories of occupational exposure to TRI but not suffering from any cancer. Among the 45 renal cell cancer patients, 27 carried at least one functional GSTM1 allele (GSTM1+) and 18 at least one functional GSTT1 allele (GSTT1+). Among the 48 reference workers, 17 were GSTM1+ and 31 were GSTT1+. Odds ratios for renal cell cancer were 2.7 for GSTM1+ individuals (95% CI, 1.18-6.33; P < 0.02) and 4.2 for GSTT1+ individuals (95% CI, 1.16-14.91; P < 0.05). The data support the present concept of the nephrocarcinogenicity of TRI.
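The reported GSTM1 odds ratio can be reproduced directly from the counts given in this abstract (27 of 45 cases and 17 of 48 referents GSTM1+). The confidence interval below uses Woolf's logit-based method, which is an assumption about the authors' exact approach but matches the reported 1.18-6.33.

```python
import math

a, b = 27, 45 - 27   # cases: GSTM1+, GSTM1-null
c, d = 17, 48 - 17   # referents: GSTM1+, GSTM1-null

odds_ratio = (a * d) / (b * c)
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)          # Woolf's method
lo = math.exp(math.log(odds_ratio) - 1.96 * se_log_or)
hi = math.exp(math.log(odds_ratio) + 1.96 * se_log_or)
print(f"OR = {odds_ratio:.1f} (95% CI {lo:.2f}-{hi:.2f})")  # ~2.7 (1.18-6.33)
```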
Abstract:
Background: Serosorting, the practice of seeking to engage in unprotected anal intercourse with partners of the same HIV status as oneself, has been increasing among men who have sex with men. However, the effectiveness of serosorting as a strategy to reduce HIV risk is unclear, especially since it depends on the frequency of HIV testing. Methods: We estimated the relative risk of HIV acquisition associated with serosorting compared with not serosorting by using a mathematical model informed by detailed behavioral data from a highly studied cohort of gay men. Results: We demonstrate that serosorting is unlikely to be highly beneficial in many populations of men who have sex with men, especially where the prevalence of undiagnosed HIV infection is relatively high. We find that serosorting reduces the relative risk of HIV transmission only if the prevalence of undiagnosed HIV infection is less than ∼20% and ∼40% in populations with high (70%) and low (20%) treatment rates, respectively, even though treatment reduces the absolute risk of HIV transmission. Serosorting can be expected to lead to increased risk of HIV acquisition in many settings. In settings with low HIV testing rates, serosorting can more than double the risk of HIV acquisition. Conclusions: Caution should therefore be exercised before endorsing the practice of serosorting. It remains very important to continue promoting frequent HIV testing and condom use, particularly among people at high risk.
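The logic of this result can be illustrated with a deliberately simple per-partnership risk comparison: a serosorter forgoes condoms with partners who report being HIV-negative (a fraction of whom are undiagnosed, untreated positives), while a non-serosorter mixes with partners of any status but uses condoms for some proportion of acts. The model structure and every parameter value below are illustrative assumptions, not the authors' published model, so the crossover thresholds will not match the paper's ∼20% and ∼40% figures.

```python
def risk_serosorting(undiag_prev, beta_untreated=0.10):
    # unprotected acts; all risk comes from undiagnosed infections among
    # partners who report being HIV-negative
    return undiag_prev * beta_untreated

def risk_not_serosorting(diag_prev, undiag_prev, treat_rate,
                         beta_untreated=0.10, beta_treated=0.01,
                         condom_use=0.5, condom_efficacy=0.8):
    # partners of any status; condoms used for a proportion of acts
    untreated = undiag_prev + diag_prev * (1 - treat_rate)
    treated = diag_prev * treat_rate
    unprotected_risk = untreated * beta_untreated + treated * beta_treated
    return unprotected_risk * (1 - condom_use * condom_efficacy)

for undiag in (0.05, 0.20, 0.40):
    rr = risk_serosorting(undiag) / risk_not_serosorting(
        diag_prev=0.10, undiag_prev=undiag, treat_rate=0.70)
    print(f"undiagnosed prevalence {undiag:.0%}: relative risk of serosorting = {rr:.2f}")
```

With these assumed parameters, serosorting comes out roughly neutral at low undiagnosed prevalence and harmful at higher prevalence, which mirrors the direction of the finding above without reproducing its exact thresholds.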
Abstract:
Aim Low prevalence rates of malnutrition (2.5% to 4%) have previously been reported in two tertiary paediatric Australian hospitals. The current study is the first to measure the prevalence of malnutrition, obesity and nutritional risk among paediatric inpatients in multiple hospitals throughout Australia. Methods The prevalence of malnutrition and obesity was investigated in 832 paediatric inpatients, and nutritional risk in 570, in eight tertiary paediatric hospitals and eight regional hospitals across Australia on a single day. Malnutrition and obesity prevalence were determined using z-scores and body mass index (BMI) percentiles. High nutritional risk was defined as a Paediatric Yorkhill Malnutrition Score of 2 or more. Results The prevalence rates of malnutrition, wasting, stunting, overweight and obesity among paediatric patients were 15%, 13.8%, 11.9%, 8.8% and 9.9%, respectively. Patients who identified as Aboriginal and Torres Strait Islander were more likely to have lower height-for-age z-scores (P < 0.01); however, BMI and weight-for-age z-scores were not significantly different. Children who were younger, from regional hospitals or with a primary diagnosis of cardiac disease or cystic fibrosis had significantly lower anthropometric z-scores (P = 0.05). Forty-four per cent of patients were identified as at high nutritional risk and requiring further nutritional assessment. Conclusions The prevalence of malnutrition and nutritional risk among Australian paediatric inpatients on a given day was much higher than in the healthy population. In contrast, the proportion of overweight and obese patients was lower.
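A sketch of the kind of anthropometric classification this abstract refers to. The cutoffs follow common WHO-style conventions (z-score below -2 for wasting and stunting; BMI percentile at or above 85 and 95 for overweight and obesity) and are assumptions for illustration; the study's exact definitions may differ.

```python
def classify(bmi_for_age_z, height_for_age_z, bmi_percentile):
    # assumed WHO-style cutoffs, for illustration only
    flags = []
    if bmi_for_age_z < -2:
        flags.append("wasted")
    if height_for_age_z < -2:
        flags.append("stunted")
    if bmi_percentile >= 95:
        flags.append("obese")
    elif bmi_percentile >= 85:
        flags.append("overweight")
    return flags or ["within normal range"]

print(classify(bmi_for_age_z=-2.4, height_for_age_z=-1.1, bmi_percentile=8))   # ['wasted']
print(classify(bmi_for_age_z=0.9, height_for_age_z=0.2, bmi_percentile=96))    # ['obese']
```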
Abstract:
Objectives: Few studies have assessed the risk and impact of lymphedema among women treated for endometrial cancer. We aimed to quantify the cumulative incidence of, and risk factors for, developing lymphedema following treatment for endometrial cancer and to estimate absolute risks for individuals. Further, we report unmet needs for help with lymphedema-specific issues. Methods: Women treated for endometrial cancer (n = 1243) were followed up 3–5 years after diagnosis; a subset of 643 completed a follow-up survey that asked about lymphedema and lymphedema-related support needs. We identified a diagnosis of secondary lymphedema from medical records or self-report. Multivariable logistic regression was used to evaluate risk factors and estimate absolute risks. Results: Overall, 13% of women developed lymphedema. Risk varied markedly with the number of lymph nodes removed and, to a lesser extent, with receipt of adjuvant radiation or chemotherapy treatment and pre-diagnosis use of nonsteroidal anti-inflammatory drugs. The absolute risk of developing lymphedema was > 50% for women with 15+ nodes removed and 2–3 additional risk factors, 30–41% for those with 15+ nodes removed plus 0–1 risk factors or 6–14 nodes removed plus 3 risk factors, but ≤ 8% for women with no nodes removed, or 1–5 nodes removed but no additional risk factors. Over half (55%) of those who developed lymphedema reported unmet needs, particularly with lymphedema-related costs and pain. Conclusion: Lymphedema is common, experienced by one in eight women following endometrial cancer. Women who have undergone lymphadenectomy have very high risks of lymphedema and should be informed how to self-monitor for symptoms. Affected women need greater levels of support.
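Individual absolute risks like those quoted above are read off a multivariable logistic model as predicted probabilities. The sketch below shows the arithmetic with hypothetical coefficients; the intercept, category coding and effect sizes are invented placeholders, not the study's fitted values.

```python
import math

def predicted_risk(nodes_removed_cat, n_other_risk_factors):
    # hypothetical coefficients: intercept, per node-category step, per additional factor
    b0, b_nodes, b_factor = -3.2, 0.75, 0.4
    lp = b0 + b_nodes * nodes_removed_cat + b_factor * n_other_risk_factors
    return 1 / (1 + math.exp(-lp))      # predicted probability of lymphedema

# nodes_removed_cat coding (assumed): 0 = none, 1 = 1-5 nodes, 2 = 6-14 nodes, 3 = 15+ nodes
print(f"{predicted_risk(3, 3):.0%}")    # 15+ nodes removed, 3 additional risk factors
print(f"{predicted_risk(0, 0):.0%}")    # no nodes removed, no additional risk factors
```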
Abstract:
Introduction Risk factor analyses for nosocomial infections (NIs) are complex. First, due to competing events for NI, the association between a risk factor and NI as measured using hazard rates may not coincide with the association measured using cumulative probability (risk). Second, patients from the same intensive care unit (ICU), who share the same environmental exposure, are likely to be more similar with regard to risk factors predisposing to NI than patients from different ICUs. We aimed to develop an analytical approach that accounts for both features and to use it to evaluate associations between patient- and ICU-level characteristics and both the rates of NI and competing risks and the cumulative probability of infection. Methods We considered a multicenter database of 159 intensive care units containing 109,216 admissions (813,739 admission-days) from the Spanish HELICS-ENVIN ICU network. We analyzed the data using two models: an etiologic model (rate based) and a predictive model (risk based). In both models, random effects (shared frailties) were introduced to assess heterogeneity. Death and discharge without NI were treated as competing events for NI. Results There was large heterogeneity across ICUs in NI hazard rates, which remained after accounting for multilevel risk factors, meaning that there are remaining unobserved ICU-specific factors that influence NI occurrence. Heterogeneity across ICUs in terms of the cumulative probability of NI was even more pronounced. Several risk factors had markedly different associations in the rate-based and risk-based models. For some, the associations differed in magnitude: for example, high Acute Physiology and Chronic Health Evaluation II (APACHE II) scores were associated with modest increases in the rate of nosocomial bacteremia but large increases in the risk. Others differed in direction: for example, respiratory versus cardiovascular diagnostic categories were associated with a reduced rate of nosocomial bacteremia but an increased risk. Conclusions A combination of competing risks and multilevel models is required to understand direct and indirect risk factors for NI and to distinguish patient-level from ICU-level factors.
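A rough sketch of the two complementary views described here, rate based (cause-specific Cox model) and risk based (cumulative incidence with competing events), using the lifelines library on toy data. lifelines does not fit shared-frailty random effects, so ICU-level clustering is only approximated with robust clustered standard errors; the data, covariates and event coding are illustrative assumptions.

```python
import numpy as np
import pandas as pd
from lifelines import CoxPHFitter, AalenJohansenFitter

rng = np.random.default_rng(3)
n = 5000
df = pd.DataFrame({
    "icu": rng.integers(0, 50, n),       # ICU identifier (cluster)
    "apache2": rng.normal(17, 7, n),     # APACHE II score
    "los": rng.exponential(8, n),        # days at risk in the ICU
    # event code: 0 = censored, 1 = nosocomial infection, 2 = death, 3 = discharge without NI
    "event": rng.choice([0, 1, 2, 3], n, p=[0.05, 0.10, 0.15, 0.70]),
})

# rate-based view: cause-specific Cox model for NI, treating the competing
# events (death, discharge without NI) as censoring
cph = CoxPHFitter().fit(df.assign(ni=(df["event"] == 1).astype(int)),
                        duration_col="los", event_col="ni",
                        formula="apache2", cluster_col="icu")
print(cph.summary[["coef", "exp(coef)", "p"]])

# risk-based view: cumulative probability (cumulative incidence) of NI,
# accounting for the competing events
ajf = AalenJohansenFitter().fit(df["los"], df["event"], event_of_interest=1)
print(ajf.cumulative_density_.tail())
```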