905 results for: Natech events, lightning, risk assessment, chemical plant


Relevance: 100.00%

Publisher:

Abstract:

People, animals and the environment can be exposed to multiple chemicals at once from a variety of sources, but current risk assessment is usually carried out one chemical substance at a time. In human health risk assessment, ingestion of food is considered a major route of exposure to many contaminants, notably mycotoxins, a wide group of fungal secondary metabolites known to potentially cause toxic and carcinogenic outcomes. Mycotoxins are commonly found in a variety of foods, including those intended for consumption by infants and young children, and have been found in processed cereal-based foods available on the Portuguese market. The use of mathematical models, including probabilistic approaches based on Monte Carlo simulations, is a prominent issue in human health risk assessment in general and in mycotoxin exposure assessment in particular. The present study aims to characterize, for the first time, the risk associated with the exposure of Portuguese children to single and multiple mycotoxins present in processed cereal-based foods (CBF). Food consumption data for Portuguese children (0-3 years old; n=103) were collected using a 3-day food diary. Contamination data covered the quantification of 12 mycotoxins (aflatoxins, ochratoxin A, fumonisins and trichothecenes) in 20 CBF samples marketed in 2014 and 2015 in Lisbon; samples were analyzed by HPLC-FLD, LC-MS/MS and GC-MS. Daily exposure of children to mycotoxins was estimated using deterministic and probabilistic approaches, and different strategies were used to treat the left-censored data. For aflatoxins, as carcinogenic compounds, the margin of exposure (MoE) was calculated as the ratio of the BMDL (benchmark dose lower confidence limit) to the aflatoxin exposure. The magnitude of the MoE gives an indication of the risk level.
For the remaining mycotoxins, the exposure output was compared to reference dose values (TDI) in order to calculate hazard quotients (HQ, the ratio between exposure and a reference dose). For the cumulative risk assessment of multiple mycotoxins, the concentration addition (CA) concept was used: the combined margin of exposure (MoET) and the hazard index (HI) were calculated for aflatoxins and for the remaining mycotoxins, respectively. 71% of the analyzed CBF samples were contaminated with mycotoxins (at values below the legal limits), and approximately 56% of the studied children consumed CBF at least once during the 3 days. Preliminary results showed that children's exposure to single mycotoxins present in CBF was below the TDI. Aflatoxin MoE and MoET values indicated a low potential risk from exposure through consumption of CBF (values around 10000 or more). HQ and HI values for the remaining mycotoxins were below 1. Children are a particularly vulnerable population group with respect to food contaminants, and the present results point to an urgent need to establish legal limits and control strategies regarding the presence of multiple mycotoxins in children's foods in order to protect their health. The development of packaging materials with antifungal properties is a possible way to control the growth of moulds and consequently reduce mycotoxin production, helping to guarantee the quality and safety of foods intended for consumption by children.
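The risk metrics used in this abstract (MoE, HQ and HI under concentration addition) reduce to simple ratios. A minimal sketch with illustrative numbers (not the study's data; the BMDL and exposure values below are placeholders):

```python
# Risk metrics for (myco)toxin exposure assessment.
# All numeric inputs here are illustrative placeholders, not the study's data.

def margin_of_exposure(bmdl, exposure):
    """MoE = BMDL / exposure. For genotoxic carcinogens such as aflatoxins,
    MoE values of about 10000 or more are usually read as low concern."""
    return bmdl / exposure

def hazard_quotient(exposure, tdi):
    """HQ = exposure / TDI; HQ < 1 suggests exposure below the reference dose."""
    return exposure / tdi

def hazard_index(hqs):
    """HI under the concentration addition (CA) concept: sum of individual HQs."""
    return sum(hqs)

# Hypothetical exposures and TDIs (ng/kg bw/day) for three mycotoxins:
hq_values = [hazard_quotient(e, t) for e, t in [(50, 1000), (120, 2000), (30, 200)]]
print(hazard_index(hq_values))           # 0.26 -> below 1
print(margin_of_exposure(170.0, 0.01))   # 17000.0 -> above the 10000 benchmark
```

The same two functions cover the single-compound screening (HQ, MoE) and the cumulative screening (HI, and MoET computed analogously from combined exposure).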

Abstract:

The sediments of Bear Creek near Baltimore, Maryland demonstrate substantial toxicity to benthic organisms and contain a complex mixture of organic and inorganic contaminants. The present study maps the spatial extent and depth profile of toxicity and contamination in Bear Creek, and explores correlations between heavy metals, organic contaminants, and toxic responses. Two novel analytical techniques – handheld XRF and an antibody-based PAH biosensor – were applied to samples from the site to quantify total metals and total PAHs in sediments. By comprehensively assessing toxicity in Bear Creek, the present study provides data to inform future risk assessments and management decisions relating to the site, while demonstrating the benefits of applying joint biological assays and chemical assessment methods to sediments with complex contaminant mixtures.

Abstract:

Background: Depression is a major health problem worldwide and the majority of patients presenting with depressive symptoms are managed in primary care. Current approaches for assessing depressive symptoms in primary care are not accurate in predicting future clinical outcomes, which may potentially lead to over- or under-treatment. The Allostatic Load (AL) theory suggests that by measuring multi-system biomarker levels as a proxy for multi-system physiological dysregulation, it is possible to identify individuals at risk of adverse health outcomes at a prodromal stage. Allostatic Index (AI) scores, calculated by applying statistical formulations to different multi-system biomarkers, have been associated with depressive symptoms. Aims and Objectives: To test the hypothesis that a combination of allostatic load (AL) biomarkers will form a predictive algorithm in defining clinically meaningful outcomes in a population of patients presenting with depressive symptoms. The key objectives were: 1. To explore the relationship between various allostatic load biomarkers and the prevalence of depressive symptoms in patients, especially patients diagnosed with three common cardiometabolic diseases (Coronary Heart Disease (CHD), Diabetes and Stroke). 2. To explore whether allostatic load biomarkers predict clinical outcomes in patients with depressive symptoms, especially patients with three common cardiometabolic diseases (CHD, Diabetes and Stroke). 3. To develop a predictive tool to identify individuals with depressive symptoms at highest risk of adverse clinical outcomes. Methods: Datasets used: ‘DepChron’ was a dataset of 35,537 patients with existing cardiometabolic disease collected as part of routine clinical practice. ‘Psobid’ was a research data source containing health-related information from 666 participants recruited from the general population.
The clinical outcomes for both datasets were studied using electronic data linkage to hospital and mortality health records, undertaken by Information Services Division, Scotland. Cross-sectional associations between allostatic load biomarkers calculated at baseline and clinical severity of depression, assessed by a symptom score, were examined using logistic and linear regression models in both datasets. Cox proportional hazards survival models were used to assess the relationship between allostatic load biomarkers at baseline and the risk of adverse physical health outcomes at follow-up in patients with depressive symptoms. The possibility of interaction between depressive symptoms and allostatic load biomarkers in risk prediction of adverse clinical outcomes was studied using analysis of variance (ANOVA). Finally, the value of constructing a risk scoring scale using patient demographics and allostatic load biomarkers to predict adverse outcomes in depressed patients was investigated using clinical risk prediction modelling and Area Under Curve (AUC) statistics. Key Results: Literature Review Findings: The literature review showed that twelve blood-based peripheral biomarkers were statistically significant in predicting six different clinical outcomes in participants with depressive symptoms. Outcomes related to both mental health (depressive symptoms) and physical health were statistically associated with pre-treatment levels of peripheral biomarkers; however, only two studies investigated outcomes related to physical health. Cross-sectional Analysis Findings: In DepChron, dysregulation of individual allostatic biomarkers (mainly cardiometabolic) was found to have a non-linear association with increased probability of co-morbid depressive symptoms (assessed by Hospital Anxiety and Depression Score, HADS-D≥8).
A composite AI score constructed using five biomarkers did not lead to any improvement in the observed strength of the association. In Psobid, BMI was found to have a significant cross-sectional association with the probability of depressive symptoms (assessed by General Health Questionnaire, GHQ-28≥5). BMI, triglycerides, high-sensitivity C-reactive protein (CRP) and High Density Lipoprotein (HDL) cholesterol were found to have a significant cross-sectional relationship with the continuous measure of GHQ-28. A composite AI score constructed using 12 biomarkers did not show a significant association with depressive symptoms among Psobid participants. Longitudinal Analysis Findings: In DepChron, three clinical outcomes were studied over four years: all-cause death, all-cause hospital admissions and a composite major adverse cardiovascular outcome, MACE (cardiovascular death or admission due to MI/stroke/HF). Presence of depressive symptoms and a composite AI score calculated mainly from peripheral cardiometabolic biomarkers were each found to have a significant association with all three clinical outcomes over the following four years in DepChron patients. There was no evidence of an interaction between AI score and presence of depressive symptoms in risk prediction of any of the three clinical outcomes. There was a statistically significant interaction between SBP and depressive symptoms in risk prediction of the major adverse cardiovascular outcome, and also between HbA1c and depressive symptoms in risk prediction of all-cause mortality for patients with diabetes. In Psobid, depressive symptoms (assessed by GHQ-28≥5) did not have a statistically significant association with any of the four outcomes under study at seven years: all-cause death, all-cause hospital admission, MACE and incidence of new cancer. A composite AI score at baseline had a significant association with the risk of MACE at seven years, after adjusting for confounders.
A continuous measure of IL-6 observed at baseline had a significant association with the risk of three clinical outcomes: all-cause mortality, all-cause hospital admissions and major adverse cardiovascular events. Raised total cholesterol at baseline was associated with a lower risk of all-cause death at seven years, while a raised waist-hip ratio (WHR) at baseline was associated with a higher risk of MACE at seven years among Psobid participants. There was no significant interaction between depressive symptoms and peripheral biomarkers (individual or combined) in risk prediction of any of the four clinical outcomes under consideration. Risk Scoring System Development: In the DepChron cohort, a scoring system was constructed based on eight baseline demographic and clinical variables to predict the risk of MACE over four years. The AUC value for the risk scoring system was modest at 56.7% (95% CI 55.6 to 57.5%). In Psobid, it was not possible to perform this analysis due to the low event rate observed for the clinical outcomes. Conclusion: Individual peripheral biomarkers were found to have a cross-sectional association with depressive symptoms both in patients with cardiometabolic disease and in middle-aged participants recruited from the general population. AI scores calculated with different statistical formulations were of no greater benefit in predicting concurrent depressive symptoms or clinical outcomes at follow-up, over and above their individual constituent biomarkers, in either cohort. SBP had a significant interaction with depressive symptoms in predicting cardiovascular events in patients with cardiometabolic disease; HbA1c had a significant interaction with depressive symptoms in predicting all-cause mortality in patients with diabetes. Peripheral biomarkers may have a role in predicting clinical outcomes in patients with depressive symptoms, especially those with existing cardiometabolic disease, and this merits further investigation.
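The AUC statistic used above to judge the risk scoring system can be computed directly from paired scores and outcomes: it is the probability that a randomly chosen case with the event scores higher than one without. A self-contained sketch on toy data (not the thesis cohort):

```python
def auc(scores, labels):
    """Area under the ROC curve by pairwise comparison: the fraction of
    positive/negative pairs in which the positive case scores higher
    (ties count half)."""
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    return wins / (len(pos) * len(neg))

# Toy risk scores with hypothetical 4-year MACE outcomes (1 = event):
scores = [0.9, 0.8, 0.7, 0.6, 0.5, 0.4]
labels = [1,   0,   1,   0,   0,   0]
print(auc(scores, labels))  # 0.875
```

An AUC of 0.5 means the score is no better than chance, which puts the thesis's reported 56.7% in context as only modestly discriminative.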

Abstract:

This paper presents a harmonised framework of sediment quality assessment and dredging material characterisation for estuaries and port zones of the North and South Atlantic. This framework, based on the weight-of-evidence approach, provides a structure and a process for conducting sediment/dredging material assessment that leads to a decision. The main structure consists of step 1 (examination of available data); step 2 (chemical characterisation and toxicity assessment); decision 1 (any chemical level higher than reference values? are sediments toxic?); step 3 (assessment of benthic community structure); step 4 (integration of the results); decision 2 (are sediments toxic or benthic community impaired?); step 5 (construction of the decision matrix) and decision 3 (is there environmental risk?). The sequence of assessments may be interrupted when the information obtained is judged sufficient for a correct characterisation of the risk posed by the sediments/dredging material. This framework introduced novel features compared to other sediment/dredging material risk assessment frameworks: data integration through multivariate analysis allows the identification of which samples are toxic and/or related to impaired benthic communities; it also discriminates the chemicals responsible for negative biological effects; and the framework dispenses with the use of a reference area. We demonstrated the successful application of this framework in different port and estuarine zones of the North (Gulf of Cadiz) and South Atlantic (Santos and Paranagua Estuarine Systems).
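The stepwise logic above (evidence gathered in stages, with early exit when a decision can already be made) can be sketched as a small decision function. This is a hedged illustration of the control flow only; the argument names and return strings are hypothetical placeholders, not the paper's decision matrix:

```python
# Sketch of the tiered weight-of-evidence decision flow described above.
# Inputs and output labels are hypothetical placeholders.

def assess_sediment(chem_exceeds_reference, is_toxic, benthos_impaired=None):
    """Walk the framework's decisions: stop early when the evidence already
    supports a characterisation; otherwise request the next line of evidence."""
    # Decision 1: chemistry within reference values and no toxicity -> stop early.
    if not chem_exceeds_reference and not is_toxic:
        return "no further action"
    # Step 3 / Decision 2: benthic community data are needed before concluding.
    if benthos_impaired is None:
        return "assess benthic community structure"
    # Decision 3: integrate chemistry, toxicity and benthos evidence.
    if is_toxic or benthos_impaired:
        return "environmental risk: manage dredging material"
    return "chemical exceedance without biological effect: monitor"

print(assess_sediment(False, False))      # early exit at decision 1
print(assess_sediment(True, True, True))  # full evidence, risk confirmed
```

The early-return structure mirrors the framework's point that the assessment sequence "may be interrupted" once the information is judged sufficient.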

Abstract:

The service of a critical infrastructure, such as a municipal wastewater treatment plant (MWWTP), is taken for granted until a flood or another low-frequency, high-consequence crisis brings its fragility to attention. The unique aspects of the MWWTP call for a method to quantify the flood stage-duration-frequency relationship. By developing a bivariate joint distribution model of flood stage and duration, this study adds a second dimension, time, to flood risk studies. A new parameter, inter-event time, is developed to further illustrate the effect of event separation on the frequency assessment. The method is tested on riverine, estuarine and tidal sites in the Mid-Atlantic region. Equipment damage functions are characterized by linear and step damage models. The Expected Annual Damage (EAD) of the underground equipment is then estimated from the parametric joint distribution model, as a function of both flood stage and duration, demonstrating the application of the bivariate model in risk assessment. Flood likelihood may alter due to climate change. A sensitivity analysis method is developed to assess future flood risk by estimating flood frequency under conditions of higher sea level and stream flow response to increased precipitation intensity. Scenarios based on steady and unsteady flow analysis are generated for the current climate, future climate within this century, and future climate beyond this century, consistent with the MWWTP planning horizons. The spatial extent of flood risk is visualized by inundation mapping and a GIS-Assisted Risk Register (GARR). This research will help stakeholders in this critical infrastructure be aware of the flood risk, its vulnerability, and the inherent uncertainty.
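Expected Annual Damage from a joint stage-duration model can be approximated by Monte Carlo integration: draw (stage, duration) pairs from the joint distribution and average the damage function. A sketch with an assumed joint model and a step damage function; the distributions, threshold and cost are placeholders, not the study's fitted parameters:

```python
import random

def step_damage(stage_m, duration_h, threshold_m=1.0, cost=250_000.0):
    """Step damage model: full equipment loss once flood stage tops the
    threshold for any positive duration (placeholder threshold and cost)."""
    return cost if stage_m > threshold_m and duration_h > 0 else 0.0

def expected_annual_damage(n=100_000, seed=42):
    """Monte Carlo EAD over an assumed correlated stage-duration model:
    exponential annual-maximum stage, duration loosely tied to stage."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n):
        stage = rng.expovariate(1.0)           # annual max stage (m), assumed
        duration = 6.0 * stage * rng.random()  # duration (h), assumed coupling
        total += step_damage(stage, duration)
    return total / n

print(round(expected_annual_damage()))
```

With these assumed inputs the EAD converges toward cost × P(stage > threshold); a linear damage model would simply replace `step_damage` with a function proportional to stage and duration.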

Abstract:

Current procedures for flood risk estimation assume flood distributions are stationary over time, meaning annual maximum flood (AMF) series are not affected by climatic variation, land use/land cover (LULC) change, or management practices. Thus, changes in LULC and climate are generally not accounted for in policy and design related to flood risk/control, and historical flood events are deemed representative of future flood risk. These assumptions need to be re-evaluated, however, as climate change and anthropogenic activities have been observed to have large impacts on flood risk in many areas. In particular, understanding the effects of LULC change is essential to the study and understanding of global environmental change and the consequent hydrologic responses. The research presented herein provides possible causation for observed nonstationarity in AMF series with respect to changes in LULC, as well as a means to assess the degree to which future LULC change will impact flood risk. Four watersheds in the Midwestern, Northeastern, and Central United States were studied to determine flood risk associated with historical and projected future LULC change. Historical single-frame aerial images dating back to the mid-1950s were used, along with Geographic Information Systems (GIS) and remote sensing models (SPRING and ERDAS), to create historical land use maps. The Forecasting Scenarios of Future Land Use Change (FORE-SCE) model was applied to generate future LULC maps annually from 2006 to 2100 for the conterminous U.S. based on the four IPCC-SRES future emission scenario conditions. These land use maps were input into previously calibrated Soil and Water Assessment Tool (SWAT) models for two case study watersheds. In order to isolate the effects of LULC change, the only variable parameter was the runoff curve number associated with the land use layer.
All simulations were run with daily climate data from 1978 to 1999, consistent with the 'base' model, which employed the 1992 NLCD to represent 'current' conditions. Output daily maximum flows were converted to instantaneous AMF series and were subsequently modeled using a Log-Pearson Type 3 (LP3) distribution to evaluate flood risk. Analysis of the progression of LULC change over the historic period and associated SWAT outputs revealed that AMF magnitudes tend to increase over time in response to increasing degrees of urbanization. This is consistent with positive trends in the AMF series identified in previous studies, although there are difficulties identifying correlations between LULC change and identified change points due to large time gaps in the generated historical LULC maps, mainly caused by the unavailability of sufficient-quality historic aerial imagery. Similarly, increases in the mean and median AMF magnitude were observed in response to future LULC change projections, with the tails of the distributions remaining reasonably constant. FORE-SCE scenario A2 was found to have the most dramatic impact on AMF series, consistent with more extreme projections of population growth and demand for energy sources, agricultural land, and urban expansion, while AMF outputs based on scenario B2 showed little change for the future, as that scenario's focus is on environmental conservation and regional solutions to environmental issues.
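Fitting an LP3 distribution to an AMF series, as done with the SWAT outputs above, is commonly performed by the method of moments on the log-transformed flows. A minimal sketch using the Wilson-Hilferty approximation for the frequency factor; the AMF values are synthetic stand-ins, not model output from the study:

```python
import math
from statistics import NormalDist, mean, stdev

def lp3_quantile(flows, return_period):
    """Log-Pearson Type 3 flood quantile via the method of moments on log10
    flows, with the Wilson-Hilferty approximation for the frequency factor."""
    logs = [math.log10(q) for q in flows]
    m, s, n = mean(logs), stdev(logs), len(logs)
    # Sample skew of the log-transformed series (bias-adjusted).
    cs = (n * sum((x - m) ** 3 for x in logs)) / ((n - 1) * (n - 2) * s ** 3)
    z = NormalDist().inv_cdf(1 - 1 / return_period)  # standard normal quantile
    if abs(cs) < 1e-6:
        k = z  # zero skew: LP3 reduces to the log-normal case
    else:
        k = (2 / cs) * ((1 + cs * z / 6 - (cs / 6) ** 2) ** 3 - 1)
    return 10 ** (m + k * s)

# Synthetic AMF series (m^3/s) standing in for converted SWAT daily maxima:
amf = [120, 150, 95, 210, 180, 160, 140, 300, 110, 170, 130, 250]
q100 = lp3_quantile(amf, 100)  # 100-year flood estimate
```

Rerunning the fit on AMF series generated under each FORE-SCE scenario is then enough to compare how projected LULC change shifts the flood quantiles.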

Abstract:

This is a redacted version of the final thesis. Copyright material has been removed to comply with UK Copyright Law.

Abstract:

Clomazone (2-(2-chlorophenyl)methyl-4,4-dimethyl-3-isoxazolidinone) is a post-emergence herbicide widely used in rice fields in Rio Grande do Sul (Brazil), with high activity against Gramineae at the recommended application rate (AR) of 700 g/ha. The herbicide may enter the aquatic ecosystem through aerial application or water drainage. The presence of this chemical in the water may affect non-target organisms, leading to impairments in the aquatic food chain. Studies were conducted in this work to evaluate the risk of clomazone using the estimated mean effective concentration (EC50) for the microalga Selenastrum capricornutum (96 h), the duckweed Lemna valdiviana (96 h) and the crustacean Daphnia similis (48 h). The EC50 values were 11.2, 31.7 and 13.8 mg/L, respectively. According to the data obtained, and considering a direct input of the herbicide into a 10 cm water column, the estimated maximum application rate that does not cause acute effects is 5.3 AR for S. capricornutum, 6.5 AR for D. similis and 15.0 AR for L. valdiviana. The estimated maximum application rate that does not cause chronic effects is 2.0 AR for D. similis, 1.6 AR for S. capricornutum and 4.5 AR for L. valdiviana.
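The "direct input into a 10 cm water column" scenario fixes the predicted environmental concentration (PEC) arithmetically: one hectare under 10 cm of water holds 10^6 L, so 700 g/ha dilutes to 0.7 mg/L. A sketch of that calculation and the raw EC50/PEC ratio; note the abstract's acute/chronic AR multiples also embed assessment factors not reproduced here:

```python
def pec_mg_per_l(application_g_per_ha, water_depth_cm=10.0):
    """Predicted environmental concentration for direct over-water input:
    1 ha = 10^4 m^2; a 10 cm column over 1 ha holds 10^6 L of water."""
    litres_per_ha = 10_000 * (water_depth_cm / 100.0) * 1000  # m2 * m * (L/m3)
    return application_g_per_ha * 1000 / litres_per_ha        # g -> mg, per L

pec = pec_mg_per_l(700)   # 0.7 mg/L at the recommended rate (AR)
ratio = 11.2 / pec        # raw EC50 (S. capricornutum) / PEC, no safety factor
```

Dividing each species' EC50 by the per-AR concentration gives the number of application rates before the acute endpoint is reached; the study's lower reported multiples reflect the additional assessment factors applied.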

Abstract:

Queensland Department of Main Roads, Australia, spends approximately A$1 billion annually on road infrastructure asset management. To manage road infrastructure effectively, road agencies need, first, to optimise expenditure on data collection without jeopardising the reliability of using the optimised data to predict maintenance and rehabilitation costs; second, to predict the deterioration rates of infrastructure accurately, reflecting local conditions, so that budgets can be estimated accurately; and finally, to attach a known degree of reliability to predicted maintenance and rehabilitation budgets. This paper presents the results of case studies using a probability-based method for an integrated approach (i.e. assessing optimal costs of pavement strength data collection, calibrating deterioration prediction models to suit local conditions, and assessing risk-adjusted budget estimates for road maintenance and rehabilitation over the life cycle). The probability concept opens the path to predicting life-cycle maintenance and rehabilitation budget estimates with a known probability of success (e.g. a budget estimate for a project life-cycle cost with a 5% probability of being exceeded). The paper also presents a conceptual decision-making framework in the form of risk mapping, in which the life-cycle budget/cost investment can be considered in conjunction with social, environmental and political issues.
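A budget with a stated probability of exceedance, as described above, is simply a percentile of the simulated life-cycle cost distribution. A sketch with an assumed triangular cost model (the cost bounds and mode are illustrative placeholders, not the paper's calibrated inputs):

```python
import random

def budget_with_exceedance(cost_draws, exceedance=0.05):
    """Return the budget the simulated life-cycle cost exceeds with the given
    probability: here the 95th percentile of the Monte Carlo draws."""
    ranked = sorted(cost_draws)
    idx = min(len(ranked) - 1, int((1 - exceedance) * len(ranked)))
    return ranked[idx]

# Assumed triangular life-cycle costs (A$M) for a rehabilitation project:
rng = random.Random(1)
draws = [rng.triangular(8.0, 15.0, 10.0) for _ in range(10_000)]
budget = budget_with_exceedance(draws)  # budget with 5% exceedance probability
```

Reporting the 95th percentile rather than the mean is what gives the estimate its "known probability of success": the agency can state that the budget will be exceeded in only about 1 year in 20.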

Abstract:

This report focuses on risk-assessment practices in the private rental market, with particular consideration of their impact on low-income renters. It is based on the fieldwork undertaken in the second stage of the research process that followed completion of the Positioning Paper. The key research questions this study addressed were: What are the various factors included in ‘risk-assessments’ by real estate agents in allocating ‘affordable’ tenancies? How are these risks quantified and managed? What are the key outcomes of their decision-making? The study builds on previous research demonstrating that a relatively large proportion of low-cost private rental accommodation is occupied by moderate- to high-income households (Wulff and Yates 2001; Seelig 2001; Yates et al. 2004). This is occurring in an environment where the private rental sector is now the de facto main provider of rental housing for lower-income households across Australia (Seelig et al. 2005) and where a number of factors are implicated in patterns of ‘income–rent mismatching’. These include ongoing shifts in public housing assistance; issues concerning eligibility for rent assistance; ‘supply’ factors, such as loss of low-cost rental stock through upgrading and/or transfer to owner-occupied housing; patterns of supply and demand driven largely by middle- to high-income owner-investors and renters; and patterns of housing need among low-income households for whom affordable housing is not appropriate. In formulating a way of approaching the analysis of ‘risk-assessment’ in rental housing management, this study applied three sociological perspectives on risk: Beck’s (1992) formulation of risk society as entailing processes of ‘individualisation’; a socio-cultural perspective which emphasises the situated nature of perceptions of risk; and a perspective which has drawn attention to different modes of institutional governance of subjects, as ‘carriers of specific indicators of risk’.
The private rental market was viewed as a social institution, and the research strategy was informed by ‘institutional ethnography’ as a method of enquiry. The study was based on interviews with property managers, real estate industry representatives, tenant advocates and community housing providers. The primary focus of inquiry was on ‘the moment of allocation’. Six local areas across metropolitan and regional Queensland, New South Wales, and South Australia were selected as case study localities. In terms of the main findings, it is evident that access to private rental housing is not just a matter of ‘supply and demand’. It is also about assessment of risk among applicants. Risk – perceived or actual – is thus a critical factor in deciding who gets housed, and how. Risk and its assessment matter in the context of housing provision and in the development of policy responses. The outcomes from this study also highlight a number of salient points: 1. There are two principal forms of risk associated with property management: financial risk and risk of litigation. 2. Certain tenant characteristics and/or circumstances – ability to pay and ability to care for the rented property – are the main factors focused on in assessing risk among applicants for rental housing. Signals of either ‘(in)ability to pay’ and/or ‘(in)ability to care for the property’ are almost always interpreted as markers of high levels of risk. 3. The processing of tenancy applications entails a complex and variable mix of formal and informal strategies of risk-assessment and allocation, where sorting (out), ranking, discriminating and handing over characterise the process. 4. In the eyes of property managers, ‘suitable’ tenants can be conceptualised as those who are resourceful, reputable, competent, strategic and presentable. 5. Property managers clearly articulated concern about risks entailed in a number of characteristics or situations.
Being on a low income was the principal and overarching factor which agents considered. Others included:
- unemployment
- ‘big’ families; sole-parent families
- domestic violence
- marital breakdown
- a shift from home ownership to private rental
- Aboriginality and specific ethnicities
- physical incapacity
- aspects of ‘presentation’.
The financial vulnerability of applicants in these groups can be invoked, alongside expressed concerns about compromised capacities to manage income and/or ‘care for’ the property, as legitimate grounds for rejection or a lower ranking. 6. At the level of face-to-face interaction between the property manager and applicants, more intuitive assessments of risk based upon past experience or ‘gut feelings’ come into play. These judgements are interwoven with more systematic procedures of tenant selection. The findings suggest that considerable ‘risk’ is associated with low-income status, either directly or insofar as it is associated with other forms of perceived risk, and that such risks are likely to impede access to the professionally managed private rental market. Detailed analysis suggests that opportunities for access to housing by low-income householders also arise where, for example:
- the ‘local experience’ of an agency and/or property manager works in favour of particular applicants
- applicants can demonstrate available social support and financial guarantors
- an applicant’s preference or need for longer-term rental is seen to provide a level of financial security for the landlord
- applicants are prepared to agree to specific, more stringent conditions for inspection of properties and review of contracts
- the particular circumstances and motivations of landlords lead them to consider a wider range of applicants
- in particular circumstances, property managers are prepared to give special consideration to applicants who appear worthy, albeit ‘risky’.
The strategic actions of demonstrating and documenting on the part of vulnerable (low-income) tenant applicants can improve their chances of being perceived as resourceful, capable and ‘savvy’. Such actions are significant because they help to persuade property managers not only that the applicant may have sufficient resources (personal and material) but that they accept that the onus is on them to show they are reputable, that they have valued ‘competencies’ and that they understand ‘how the system works’. The parameters of the market do shape the processes of risk-assessment and, ultimately, the strategic relation of power between property manager and tenant applicant. Low vacancy rates and a limited supply of lower-cost rental stock, in all areas, mean that there are many more tenant applicants than available properties, creating a highly competitive environment for applicants. The fundamental problem of supply is an aspect of the market that severely limits the chances of access to appropriate and affordable housing for low-income rental housing applicants. There is recognition of the impact of this problem of supply. The study indicates three main directions for future focus in policy and program development: providing appropriate supports to tenants to access and sustain private rental housing, addressing issues of discrimination and privacy arising in the processes of selecting suitable tenants, and addressing problems of supply.

Abstract:

Crash risk is the statistical probability of a crash. Its assessment can be performed through ex post statistical analysis or in real time with on-vehicle systems, and these systems can be cooperative. Cooperative Vehicle-Infrastructure Systems (CVIS) are a developing research avenue in the automotive industry worldwide. This paper provides a survey of existing CVIS systems and of methods for assessing crash risk with them. It describes the advantages of cooperative systems over non-cooperative systems. A sample of cooperative crash risk assessment systems is analysed to extract vulnerabilities according to three criteria: market penetration, over-reliance on GPS, and broadcasting issues. The survey shows that cooperative risk assessment systems are still in their infancy and require further development to provide their full benefits to road users.

Abstract:

A considerable proportion of convicted sex offenders maintain a stance of innocence and thus do not engage in recommended treatment programs. As a result, such offenders are often deemed to have outstanding criminogenic needs, which may negatively impact risk assessment procedures and parole eligibility. This paper reports on a study that aimed to investigate a group of forensic psychologists’ attitudes regarding the impact of denial on risk assessment ratings and on parole eligibility. Participants completed a confidential open-ended questionnaire. Analysis indicated that considerable variability exists among forensic psychologists with regard to their beliefs about the origins of denial and what impact such denial should have on post-prison release eligibility. In contrast, there was less disparity regarding beliefs about the percentage of innocent yet incarcerated sex offenders. This paper also reviews current understanding of the impact of denial on recidivism and on general forensic assessments.

Abstract:

The costs of work-related crashes

In Australia and overseas, fleet safety, or work-related road safety, is an issue gaining increased attention from researchers, organisations, road safety practitioners and the general community. This attention is primarily a response to the substantial physical, emotional and economic costs associated with work-related road crashes. The increased risk factors and subsequent costs of work-related driving are also now well documented in the literature. For example, research has demonstrated that work-related drivers on average report a higher level of crash involvement than personal car drivers (Downs et al., 1999; Kweon and Kockelman, 2003), and within Australia road crashes are the most common form of work-related fatality (Haworth et al., 2000).