540 results for Risk-taking
Abstract:
Background On-site wastewater treatment system (OWTS) siting, design and management have traditionally been based on site-specific conditions, with little regard to the surrounding environment or the cumulative effect of other systems in the environment. The general approach has been to apply the same framework of standards and regulations to all sites equally, regardless of the sensitivity, or lack thereof, of the receiving environment. Consequently, this has led to the continuing poor performance and failure of on-site systems, resulting in environmental and public health consequences. As a result, there is increasing realisation that more scientifically robust evaluations of site assessment and the underlying ground conditions are needed. Risk-based approaches to on-site system siting, design and management are considered the most appropriate means of improving the current standards and codes for on-site wastewater treatment systems. The Project Research for this project was undertaken within the Gold Coast City Council region, the major focus being the semi-urban, rural residential and hinterland areas of the city that are not serviced by centralised treatment systems. The Gold Coast has over 15,000 on-site systems in use, approximately 66% of which are common septic tank-subsurface dispersal systems. A recent study evaluating the performance of these systems within the Gold Coast area showed that approximately 90% were not meeting the specified guidelines for effluent treatment and dispersal. The main focus of this research was to incorporate strong scientific knowledge into an integrated risk assessment process, allowing suitable management practices to be put in place to mitigate the inherent risks. To achieve this, research was undertaken on three main aspects of the performance and management of OWTS. Firstly, an investigation into the suitability of soil for providing appropriate effluent renovation was conducted.
This involved detailed soil investigations, laboratory analysis and the use of multivariate statistical methods for analysing soil information. The outcomes of these investigations were developed into a framework for assessing soil suitability for effluent renovation. This formed the basis for the assessment of OWTS siting and design risks employed in the developed risk framework. Secondly, an assessment of the environmental and public health risks was performed, specifically related to the release of contaminants from OWTS. This involved detailed groundwater and surface water sampling and analysis to assess the current and potential risks of contamination throughout the Gold Coast region. Additionally, the assessment of public health risk incorporated the use of bacterial source tracking methods to identify the different sources of faecal contamination within monitored regions. Antibiotic resistance pattern analysis was utilised to determine the extent of human faecal contamination, with the outcomes utilised to provide a more indicative public health assessment. Finally, the outcomes of both the soil suitability assessment and the ground and surface water monitoring were utilised in the development of the integrated risk framework. The research outcomes achieved through this project enabled the primary research aims and objectives to be accomplished. This in turn will enable Gold Coast City Council to provide more appropriate assessment and management guidelines based on robust scientific knowledge, which will ultimately ensure that the potential environmental and public health impacts resulting from on-site wastewater treatment are minimised. As part of the implementation of suitable management strategies, a critical point monitoring (CPM) program was formulated. This entailed the identification of the key critical parameters that contribute to the characterised risks at monitored locations within the study area.
The CPM will allow more direct procedures to be implemented, targeting the specific hazards at sensitive areas throughout the Gold Coast region.
Abstract:
Efficient management of domestic wastewater is a primary requirement for human well-being. Failure to adequately address issues of wastewater collection, treatment and disposal can lead to adverse public health and environmental impacts. The increasing spread of urbanisation has led to the conversion of previously rural land into urban developments and the more intensive development of semi-urban areas. However, the provision of reticulated sewerage facilities has not kept pace with this expansion in urbanisation. This has resulted in a growing dependency on onsite sewage treatment. Though considered only a temporary measure in the past, these systems are now considered the most cost-effective option and have become a permanent feature in some urban areas. This report is the first of a series of reports to be produced and is the outcome of a research project initiated by the Brisbane City Council. The primary objective of the research undertaken was to relate the treatment performance of onsite sewage treatment systems to the soil conditions at the site, with the emphasis being on septic tanks. This report consists of a ‘state of the art’ review of research undertaken in the arena of onsite sewage treatment. The evaluation of research brings together significant work undertaken locally and overseas. It focuses mainly on septic tanks, in keeping with the primary objectives of the project. This report has acted as the springboard for the later field investigations and analysis undertaken as part of the project. Septic tanks continue to be used widely due to their simplicity and low cost. Generally, the treatment performance of septic tanks can be highly variable due to numerous factors, but a properly designed, operated and maintained septic tank can produce effluent of satisfactory quality.
The reduction of hydraulic surges from washing machines and dishwashers, regular removal of accumulated septage and the elimination of harmful chemicals are some of the practices that can improve system performance considerably. The relative advantages of multi-chamber over single-chamber septic tanks are an issue that needs to be resolved in view of the conflicting research outcomes. In recent years, aerobic wastewater treatment systems (AWTS) have been gaining in popularity. This can be mainly attributed to the desire to avoid subsurface effluent disposal, which is the main cause of septic tank failure. The use of aerobic processes for the treatment of wastewater, together with the disinfection of effluent prior to disposal, is capable of producing effluent of a quality suitable for surface disposal. However, the field performance of these systems has been disappointing. A significant number of these systems do not perform to stipulated standards, and effluent quality can be highly variable. This is primarily due to householder neglect or ignorance of correct operational and maintenance procedures. Other problems include greater susceptibility to shock loadings and sludge bulking. As identified in the literature, a number of design features can also contribute to this wide variation in quality. The other treatment processes in common use are the various types of filter systems. These include intermittent and recirculating sand filters. These systems too have their inherent advantages and disadvantages. Furthermore, as in the case of aerobic systems, their performance is very much dependent on individual householder operation and maintenance practices. In recent years the use of biofilters, and particularly of peat, has attracted research interest. High removal rates of various wastewater pollutants have been reported in the research literature. Despite these satisfactory results, leachate from peat has been reported in various studies.
This is an issue that needs further investigation, and as such biofilters can still be considered to be in the experimental stage. The use of other filter media such as absorbent plastic and bark has also been reported in the literature. The safe and hygienic disposal of treated effluent is a matter of concern in the case of onsite sewage treatment. Subsurface disposal is the most common option, and the only option in the case of septic tank treatment. Soil is an excellent treatment medium if suitable conditions are present. The processes of sorption, filtration and oxidation can remove the various wastewater pollutants. The subsurface characteristics of the disposal area are among the most important parameters governing process performance. Therefore it is important that the soil and topographic conditions are taken into consideration in the design of the soil absorption system. Seepage trenches and beds are the common systems in use. Seepage pits or chambers can be used where subsurface conditions warrant, whilst above-grade mounds have been recommended for a variety of difficult site conditions. All these systems have their inherent advantages and disadvantages, and the preferred soil absorption system should be selected based on site characteristics. The use of gravel as in-fill for beds and trenches is open to question. It does not contribute to effluent treatment and has been shown to reduce the effective infiltrative surface area. This is due to physical obstruction and the migration of fines entrained in the gravel into the soil matrix. The surface application of effluent is coming into increasing use with the advent of aerobic treatment systems. This has the advantage that treatment is undertaken in the upper soil horizons, which are chemically and biologically the most effective in effluent renovation. Numerous research studies have demonstrated the feasibility of this practice. However, the overriding criterion is the quality of the effluent.
It has to be of exceptionally good quality in order to ensure that there are no resulting public health impacts due to aerosol drift. This is the main issue of concern, due to the unreliability of the effluent quality from aerobic systems. Secondly, it has also been found that most householders do not take adequate care in the operation of spray irrigation systems or in the maintenance of the irrigation area. Under these circumstances, surface disposal of effluent should be approached with caution and would require appropriate householder education and stringent compliance requirements. However, despite all this, the efficiency with which the process is undertaken will ultimately rest with the individual householder, and this is where most concern rests. Greywater requires similar consideration. Surface irrigation of greywater is currently permitted in a number of local authority jurisdictions in Queensland. Considering the fact that greywater constitutes the largest fraction of the total wastewater generated in a household, it could be considered a potential resource. Unfortunately, in most circumstances the only pretreatment required prior to reuse is the removal of oil and grease. This is an issue of concern, as greywater can be considered a weak to medium-strength sewage: it contains primary pollutants such as BOD material and nutrients, and may also include microbial contamination. Therefore its use for surface irrigation can pose a potential health risk. This is further compounded by the fact that most householders are unaware of the potential adverse impacts of indiscriminate greywater reuse. As in the case of blackwater effluent reuse, there have been suggestions that greywater should also be subject to stringent guidelines. Under these circumstances, the surface application of any wastewater requires careful consideration.
The other option available for the disposal of effluent is the use of evaporation systems. The use of evapotranspiration systems has been covered in this report. Research has shown that these systems are susceptible to a number of factors, in particular climatic conditions. As such, their applicability is location specific. The design of systems based solely on evapotranspiration is also questionable; to ensure greater reliability, such systems should be designed to include soil absorption. The successful use of these systems for intermittent usage has been noted in the literature. Taking into consideration the issues discussed above, subsurface disposal of effluent is the safest under most conditions, provided the facility has been designed to accommodate site conditions. The main problem associated with subsurface disposal is the formation of a clogging mat on the infiltrative surfaces. Due to the formation of the clogging mat, the capacity of the soil to handle effluent is no longer governed by the soil’s hydraulic conductivity as measured by the percolation test, but rather by the infiltration rate through the clogged zone. The characteristics of the clogging mat have been shown to be influenced by various soil and effluent characteristics. Secondly, the mechanisms of clogging mat formation have been found to be influenced by various physical, chemical and biological processes. Biological clogging is the most common process and occurs when bacterial growth or its by-products reduce the soil pore diameters. Biological clogging is generally associated with anaerobic conditions. The formation of the clogging mat provides significant benefits. It acts as an efficient filter for the removal of microorganisms. Also, as the clogging mat increases the hydraulic impedance to flow, unsaturated flow conditions will occur below the mat. This permits greater contact between effluent and soil particles, thereby enhancing the purification process.
This is particularly important in the case of highly permeable soils. However, the adverse impacts of clogging mat formation cannot be ignored, as they can lead to a significant reduction in the infiltration rate. This in fact is the most common cause of soil absorption system failure. As the formation of the clogging mat is inevitable, it is important to ensure that it does not impede effluent infiltration beyond tolerable limits. Various strategies have been investigated to either control clogging mat formation or to remediate its severity. Intermittent dosing of effluent is one such strategy that has attracted considerable attention. Research conclusions with regard to short rest intervals are contradictory. It has been claimed that intermittent rest periods result in aerobic decomposition of the clogging mat, leading to a subsequent increase in the infiltration rate. Contrary to this, it has also been claimed that short rest periods are insufficient to completely decompose the clogging mat, and that the intermediate by-products formed by the aerobic processes in fact lead to even more severe clogging. It has been further recommended that the rest periods should be much longer, of the order of six months. This entails the provision of a second, alternating seepage bed. Other concepts that have been investigated are: designing the bed to meet the equilibrium infiltration rate that eventuates after clogging mat formation; improved geometry, such as the use of seepage trenches instead of beds; serial instead of parallel effluent distribution; and low-pressure dosing of effluent. The use of physical measures such as oxidation with hydrogen peroxide and replacement of the infiltration surface has been shown to be only of short-term benefit.
Another issue of importance is the degree of pretreatment that should be provided to the effluent prior to subsurface application, and the influence exerted by pollutant loadings on clogging mat formation. Laboratory studies have shown that the total mass loadings of BOD and suspended solids are important factors in the formation of the clogging mat. It has also been found that the nature of the suspended solids is an important factor: the finer particles from extended aeration systems, when compared to those from septic tanks, penetrate deeper into the soil and hence ultimately form a denser clogging mat. However, the importance of improved pretreatment in clogging mat formation may need to be qualified in view of other research studies. It has also been shown that effluent quality may be a factor in the case of highly permeable soils, but this may not be the case with fine-structured soils. The ultimate test of onsite sewage treatment system efficiency rests with the final disposal of effluent. The implications of system failure, as evidenced by the surface ponding of effluent or the seepage of contaminants into the groundwater, can be very serious, as they can lead to environmental and public health impacts. Significant microbial contamination of surface and groundwater has been attributed to septic tank effluent. There are a number of documented instances of septic tank-related waterborne disease outbreaks affecting large numbers of people. In a recent incident, the local authority, and not the individual septic tank owners, was found liable for an outbreak of viral hepatitis A, as no action had been taken to remedy septic tank failure. This illustrates the responsibility placed on local authorities in terms of ensuring the proper operation of onsite sewage treatment systems. Even a properly functioning soil absorption system is only capable of removing phosphorus and microorganisms.
The nitrogen remaining after plant uptake will not be retained in the soil column, but will instead gradually seep into the groundwater as nitrate. Conditions for nitrogen removal by denitrification are not generally present in a soil absorption bed. Dilution by groundwater is the only treatment available for reducing the nitrogen concentration to specified levels. Therefore, based on subsurface conditions, this essentially entails a maximum allowable density of septic tanks in a given area. Unfortunately, nitrogen is not the only wastewater pollutant of concern. Relatively long survival times and travel distances have been noted for microorganisms originating from soil absorption systems. This is likely to happen if saturated conditions persist under the soil absorption bed, or due to surface runoff of effluent as a result of system failure. Soils have a finite capacity for the removal of phosphorus. Once this capacity is exceeded, phosphorus too will seep into the groundwater. The relatively high mobility of phosphorus in sandy soils has been noted in the literature. These issues have serious implications for the design and siting of soil absorption systems. It is not only important to ensure that the system design is based on subsurface conditions; the density of these systems in a given area is also a critical issue. This essentially involves the adoption of a land capability approach to determine the limitations of an individual site for onsite sewage disposal. The most limiting factor at a particular site determines the overall capability classification for that site, which in turn dictates the type of effluent disposal method to be adopted.
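The "most limiting factor" rule for land capability classification amounts to taking the worst rating across all site factors. A minimal sketch follows; the factor names and the 1 (slight) to 4 (severe) rating scale are hypothetical placeholders, not taken from the report:

```python
# Hypothetical limitation ratings for one site: 1 = slight ... 4 = severe.
# Factor names and scale are illustrative only.
site = {
    "slope": 2,
    "depth_to_water_table": 4,
    "permeability": 3,
    "flooding": 1,
}

overall_class = max(site.values())          # the most limiting factor governs
limiting_factor = max(site, key=site.get)   # which factor sets the class

print(overall_class, limiting_factor)
```

Whatever disposal method the site can support is then dictated by `overall_class`, not by any average of the ratings.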
Abstract:
Background and Aims: Falls and fall-related injuries result in reduced functioning, loss of independence, premature nursing home admission and mortality. Malnutrition is associated with falls in the acute setting, but little is known about malnutrition and falls risk in the community. The aim of this study was to assess the association between malnutrition risk, falls risk and falls over a one-year period in community-dwelling older adults. Methods: Two hundred and fifty-four subjects aged >65 years were recruited to participate in a study to identify risk factors for falls. Malnutrition risk was determined using the Mini Nutritional Assessment–Short Form. Results: 28.6% had experienced a fall and, according to the Mini Nutritional Assessment-Short Form, 3.9% (n=10) of subjects were at risk of malnutrition. There were no associations between malnutrition risk, the risk of falls, or actual falls in healthy older adults in the community setting. Conclusions: There was a low prevalence of malnutrition risk in this sample of community-dwelling older adults and no association between nutritional risk and falls. Screening as part of a falls prevention program should focus on the risk of developing malnutrition, as this is associated with falls in other settings.
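For context, the MNA-SF screening tool used above produces a 0-14 score that is conventionally banded into three categories. A sketch of that banding follows; the cut-offs reflect the commonly published version of the instrument and should be verified against the current form before any clinical use:

```python
def mna_sf_category(score: int) -> str:
    """Band a Mini Nutritional Assessment-Short Form total (0-14).

    Cut-offs follow the commonly published banding; verify against
    the current instrument before relying on them.
    """
    if not 0 <= score <= 14:
        raise ValueError("MNA-SF total must be between 0 and 14")
    if score <= 7:
        return "malnourished"
    if score <= 11:
        return "at risk of malnutrition"
    return "normal nutritional status"

print(mna_sf_category(10))
```

The 3.9% figure in the abstract corresponds to subjects falling in the middle ("at risk") band.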
Abstract:
With the advent of large-scale wind farms and their integration into electrical grids, more uncertainties, constraints and objectives must be considered in power system development. It is therefore necessary to introduce risk-control strategies into the planning of transmission systems connected with wind power generators. This paper presents a probability-based multi-objective model equipped with three risk-control strategies. The model is developed to evaluate and enhance the ability of the transmission system to protect against overload risks when wind power is integrated into the power system. The model involves: (i) defining the uncertainties associated with wind power generators with probability measures and calculating the probabilistic power flow with the combined use of cumulants and the Gram-Charlier series; (ii) developing three risk-control strategies by specifying the smallest acceptable non-overload probability for each branch and for the whole system, and specifying the non-overload margin for all branches in the whole system; (iii) formulating an overload risk index based on the non-overload probability and the non-overload margin so defined; and (iv) developing a multi-objective transmission system expansion planning (TSEP) model with objective functions composed of transmission investment and the overload risk index. The presented work represents a superior risk-control model for TSEP in terms of security, reliability and economy. The transmission expansion planning model with the three risk-control strategies demonstrates its feasibility in case studies using two typical power systems.
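Step (i) above recovers a branch-flow distribution from its cumulants via a Gram-Charlier (Type A) expansion, from which a non-overload probability can be read off. A minimal sketch, using only the first four cumulants; the cumulant values and the 100 MW limit are invented for illustration, not taken from the paper:

```python
import math

def _phi(z):   # standard normal pdf
    return math.exp(-z * z / 2) / math.sqrt(2 * math.pi)

def _Phi(z):   # standard normal cdf
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

def gram_charlier_cdf(x, k1, k2, k3, k4):
    """CDF at x from the first four cumulants, Gram-Charlier Type A.

    Uses probabilists' Hermite polynomials He2, He3; with k3 = k4 = 0
    this reduces exactly to the Gaussian CDF.
    """
    s = math.sqrt(k2)
    z = (x - k1) / s
    g1 = k3 / s**3            # skewness
    g2 = k4 / s**4            # excess kurtosis
    he2 = z * z - 1
    he3 = z**3 - 3 * z
    return _Phi(z) - _phi(z) * (g1 / 6 * he2 + g2 / 24 * he3)

# Non-overload probability of one branch: P(flow <= 100 MW) for a flow
# with mean 80 MW, variance 100 and mildly non-Gaussian higher cumulants.
p_ok = gram_charlier_cdf(100.0, k1=80.0, k2=100.0, k3=150.0, k4=500.0)
print(round(p_ok, 4))
```

In the paper's scheme, `p_ok` for each branch would then be compared against the smallest acceptable non-overload probability specified by the risk-control strategy.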
Abstract:
Aims: This study investigated the association between the basal (rest) insulin-signaling proteins, Akt and the Akt substrate AS160, metabolic risk factors, inflammatory markers and aerobic fitness in middle-aged women with varying numbers of metabolic risk factors for type 2 diabetes. Methods: Sixteen women (n = 16) aged 51.3 +/- 5.1 years (mean +/- SD) provided muscle biopsies and blood samples at rest. In addition, anthropometric characteristics and aerobic power were assessed, and the number of metabolic risk factors for each participant was determined (IDF criteria). Results: The mean number of metabolic risk factors was 1.6 +/- 1.2. Total Akt was negatively correlated with IL-1 beta (r = -0.45, p = 0.046), IL-6 (r = -0.44, p = 0.052) and TNF-alpha (r = -0.51, p = 0.025). Phosphorylated AS160 was positively correlated with HDL (r = 0.58, p = 0.024) and aerobic fitness (r = 0.51, p = 0.047). Furthermore, a multiple regression analysis revealed that both HDL (t = 2.5, p = 0.032) and VO(2peak) (t = 2.4, p = 0.037) were better predictors of phosphorylated AS160 than TNF-alpha or IL-6 (p>0.05). Conclusions: Elevated inflammatory markers and increased metabolic risk factors may inhibit insulin-signaling protein phosphorylation in middle-aged women, thereby increasing insulin resistance under basal conditions. Furthermore, higher HDL and fitness levels are associated with increased AS160 phosphorylation, which may in turn reduce insulin resistance.
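The r values reported above are Pearson product-moment correlations. As a reminder of what that statistic measures, a minimal implementation with invented data (not the study's measurements):

```python
import math

def pearson_r(x, y):
    """Pearson product-moment correlation of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    syy = sum((b - my) ** 2 for b in y)
    return sxy / math.sqrt(sxx * syy)

# Invented example: higher inflammatory marker, lower total Akt -> r < 0,
# mirroring the sign (not the magnitude) of the reported associations.
marker = [1.0, 2.0, 3.0, 4.0, 5.0]
akt = [5.1, 4.0, 3.2, 2.1, 1.0]
r = pearson_r(marker, akt)
print(round(r, 3))
```

A negative r, as for Akt and TNF-alpha here, means the two quantities move in opposite directions across subjects.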
Abstract:
Background: As low HDL cholesterol levels are a risk factor for cardiovascular disease, raising HDL cholesterol substantially by inhibiting or modulating cholesteryl ester transfer protein (CETP) may be useful in coronary artery disease. The first CETP inhibitor to enter clinical trials, torcetrapib, was shown to increase levels of HDL cholesterol, but it also increased adverse cardiovascular outcomes, probably due to an increase in blood pressure and aldosterone secretion via an off-target mechanism or mechanisms. Objective/methods: Dalcetrapib is a new CETP modulator that increases levels of HDL cholesterol but does not increase blood pressure or aldosterone secretion. The objective was to evaluate a paper describing the effects of dalcetrapib on carotid and aortic wall thickness in subjects with, or at high risk of, coronary artery disease: the dal-PLAQUE study. Results: dal-PLAQUE showed that dalcetrapib reduced the progression of atherosclerosis, and may also have reduced the associated vascular inflammation, in subjects with, or at high risk of, coronary heart disease who were already taking statins. Conclusions: These results suggest that modulating CETP with dalcetrapib may be a beneficial mechanism in cardiovascular disease. The results of the dal-HEART series, which includes dal-PLAQUE 1 and 2 and dal-OUTCOMES, will, when complete, provide more definitive information about the benefit, or otherwise, of dalcetrapib in coronary artery disease.
Abstract:
Background Total hip arthroplasty (THA) is a commonly performed procedure and numbers are increasing with ageing populations. One of the most serious complications in THA is surgical site infection (SSI), caused by pathogens entering the wound during the procedure. SSIs are associated with a substantial burden for health services, increased mortality and reduced functional outcomes in patients. Numerous approaches to preventing these infections exist, but there is no gold standard in practice and the cost-effectiveness of alternative strategies is largely unknown. Objectives The aim of this project was to evaluate the cost-effectiveness of strategies claiming to reduce deep surgical site infections following total hip arthroplasty in Australia. The objectives were:
1. identification of competing strategies, or combinations of strategies, that are clinically relevant to the control of SSI related to hip arthroplasty;
2. evidence synthesis and pooling of results to assess the volume and quality of evidence claiming to reduce the risk of SSI following total hip arthroplasty;
3. construction of an economic decision model incorporating cost and health outcomes for each of the identified strategies;
4. quantification of the effect of uncertainty in the model;
5. assessment of the value of perfect information among model parameters to inform future data collection.
Methods The literature relating to SSI in THA was reviewed, in particular to establish definitions of these concepts, understand mechanisms of aetiology and microbiology, risk factors, diagnosis and consequences, as well as to give an overview of existing infection prevention measures. Published economic evaluations on this topic were also reviewed and their limitations for Australian decision-makers identified. A Markov state-transition model was developed for the Australian context and subsequently validated by clinicians.
The model was designed to capture key events related to deep SSI occurring within the first 12 months following primary THA. Relevant infection prevention measures were selected by reviewing clinical guideline recommendations combined with expert elicitation. The strategies selected for evaluation were the routine use of pre-operative antibiotic prophylaxis (AP) versus no use of antibiotic prophylaxis (No AP), or AP in combination with antibiotic-impregnated cement (AP & ABC) or laminar air operating rooms (AP & LOR). The best available evidence for clinical effect size and utility parameters was harvested from the medical literature using reproducible methods. Queensland hospital data were extracted to inform patients’ transitions between model health states and the related costs captured in assigned treatment codes. Costs related to infection prevention were derived from reliable hospital records and expert opinion. Uncertainty in model input parameters was explored in probabilistic sensitivity analyses and scenario analyses, and the value of perfect information was estimated. Results The cost-effectiveness analysis was performed from a health services perspective using a hypothetical cohort of 30,000 THA patients aged 65 years. The baseline rate of deep SSI was 0.96% within one year of a primary THA. The routine use of antibiotic prophylaxis (AP) was highly cost-effective and resulted in cost savings of over $1.6m whilst generating an extra 163 QALYs (without consideration of uncertainty). Deterministic and probabilistic analysis (considering uncertainty) identified antibiotic prophylaxis combined with antibiotic-impregnated cement (AP & ABC) as the most cost-effective strategy. Using AP & ABC generated the highest net monetary benefit (NMB), an incremental $3.1m NMB compared to using antibiotic prophylaxis alone. There was a very low error probability (<5%) that this strategy might not have the largest NMB.
Not using antibiotic prophylaxis (No AP) or using antibiotic prophylaxis combined with laminar air operating rooms (AP & LOR) resulted in worse health outcomes and higher costs. Sensitivity analyses showed that the model was sensitive to the initial cohort starting age and the additional costs of ABC, but the best strategy did not change, even for extreme values. The cost-effectiveness improved for a higher proportion of cemented primary THAs and higher baseline rates of deep SSI. The value of perfect information indicated that no additional research is required to support the model conclusions. Conclusions Preventing deep SSI with antibiotic prophylaxis and antibiotic-impregnated cement has been shown to improve health outcomes among hospitalised patients, save lives and enhance resource allocation. By implementing a more beneficial infection control strategy, scarce health care resources can be used more efficiently to the benefit of all members of society. The results of this project provide Australian policy makers with key information about how to efficiently manage risks of infection in THA.
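The net-monetary-benefit comparison used in this project works as follows: NMB = QALYs × willingness-to-pay − cost, and the strategy with the largest NMB is preferred. A sketch with a plausible willingness-to-pay threshold and cohort totals invented to mimic the shape of the result (these are not the project's actual model outputs):

```python
WTP = 50_000  # assumed willingness-to-pay per QALY (AU$); illustrative threshold

# (total cost, total QALYs) per strategy for a hypothetical cohort --
# numbers invented for illustration only.
strategies = {
    "No AP":    (12_000_000, 250_000.0),
    "AP":       (10_400_000, 250_163.0),
    "AP & ABC": (10_500_000, 250_230.0),
}

nmb = {name: qalys * WTP - cost for name, (cost, qalys) in strategies.items()}
best = max(nmb, key=nmb.get)
incremental = nmb["AP & ABC"] - nmb["AP"]
print(best, incremental)
```

Note how AP & ABC can win despite costing more than AP alone: the extra QALYs, valued at the threshold, outweigh the extra cost.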
Abstract:
Education is often viewed as a key approach to addressing sexual-health issues; the current concern is the burgeoning HIV/AIDS epidemic. This ethnographic study investigates the gender practices associated with high-risk sexual behaviour in Papua New Guinea (PNG) as viewed by educators there. A number of practices, including gender inequality and associated sexual behaviours, were highlighted by male and female participants as escalating PNG’s HIV/AIDS epidemic. The study finds that although participants were well informed concerning HIV/AIDS, they had varying beliefs concerning the prevailing gender/sexual issues involved in escalating high-risk behaviour and how to address the problem. The study further examines the behavioural beliefs and intentions of the educators themselves. Within the data, a number of underpinning factors pertaining to gender, education and life experience were found to be related to the behavioural beliefs and intentions of participants towards embracing change with regard to behaviours associated with gender equality in PNG. These factors appeared to encourage participants to adopt healthier gender and sexual behavioural intentions and, arguably, could provide the basis for ways to help address the gender inequality and high-risk behaviours associated with HIV/AIDS in PNG.
Abstract:
Aims: To identify risk factors for major adverse events (AEs) and to develop a nomogram to predict the probability of such AEs in individual patients who have surgery for apparent early stage endometrial cancer. Methods: We used data from 753 patients who were randomized to either total laparoscopic hysterectomy or total abdominal hysterectomy in the LACE trial. Serious adverse events that prolonged hospital stay, or postoperative adverse events of grade 3 or higher (Common Terminology Criteria for Adverse Events, CTCAE V3), were considered major AEs. We analyzed pre-surgical characteristics associated with the risk of developing major AEs by multivariate logistic regression, and identified a parsimonious model by backward stepwise logistic regression. The six most significant or clinically important variables were included in the nomogram to predict the risk of major AEs within 6 weeks of surgery, and the nomogram was internally validated. Results: Overall, 132 (17.5%) patients had at least one major AE. An open surgical approach (laparotomy), a higher Charlson comorbidity score, moderately differentiated tumours on curettings, a higher baseline ECOG score, a higher body mass index and low haemoglobin levels were associated with AEs and were used in the nomogram. The bootstrap-corrected concordance index of the nomogram was 0.63 and it showed good calibration. Conclusions: Six pre-surgical factors independently predicted the risk of major AEs. This research might form the basis for risk reduction strategies to minimize the risk of AEs among patients undergoing surgery for apparent early stage endometrial cancer.
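A nomogram is essentially a graphical readout of a fitted logistic model: each predictor contributes to a linear predictor, which the logistic function maps to a probability. A sketch with invented coefficients for the six named predictors (the fitted LACE-trial coefficients are not given in the abstract):

```python
import math

# Hypothetical coefficients for the six predictors named above --
# invented for illustration, NOT the fitted LACE-trial model.
COEF = {
    "intercept":       -3.0,
    "laparotomy":       0.8,   # open surgical approach (0/1)
    "charlson_score":   0.3,   # per point
    "mod_diff_tumour":  0.5,   # moderately differentiated tumour (0/1)
    "ecog_score":       0.4,   # per point
    "bmi_per_5_units":  0.2,   # per 5 kg/m^2
    "low_haemoglobin":  0.6,   # (0/1)
}

def major_ae_risk(patient: dict) -> float:
    """Predicted probability of a major AE from the linear predictor."""
    lp = COEF["intercept"] + sum(COEF[k] * v for k, v in patient.items())
    return 1 / (1 + math.exp(-lp))

risk = major_ae_risk({"laparotomy": 1, "charlson_score": 2, "mod_diff_tumour": 0,
                      "ecog_score": 1, "bmi_per_5_units": 2, "low_haemoglobin": 1})
print(round(risk, 3))
```

On a printed nomogram, each coefficient-times-value term is rendered as a points scale, and the points total is read off against the probability axis.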
Abstract:
The study presented in this paper reviewed 9,358 accidents that occurred in the U.S. construction industry between 2002 and 2011, in order to understand the relationships between risk factors and injury severity (e.g. fatalities, hospitalized injuries, or non-hospitalized injuries) and to develop a strategic prevention plan that reduces the likelihood of fatalities where an accident is unavoidable. The study specifically aims to: (1) verify the relationships among risk factors, accident types, and injury severity; (2) determine the significant risk factors associated with each accident type that are highly correlated with injury severity; and (3) analyze the impact of the identified key factors on accident and fatality occurrence. The analysis showed that safety managers’ roles are critical to reducing human-related risks, particularly misjudgement of hazardous situations, through safety training and education, appropriate use of safety devices, and proper safety inspection. For environment-related factors, however, the dominant risk factors differed across accident types. The outcomes of this study will assist safety managers in understanding the nature of construction accidents and in planning strategic risk mitigation, prioritizing high-frequency risk factors to control accident occurrence and manage the likelihood of fatal injuries on construction sites.
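The prioritization step the abstract describes, ranking risk factors by frequency within each accident type so the most common hazards are mitigated first, can be sketched as follows. The accident records and factor labels here are invented placeholders, not the study's dataset:

```python
from collections import Counter

# Hypothetical accident records: (accident_type, risk_factor, severity)
records = [
    ("fall",   "misjudgement of hazardous situation", "fatality"),
    ("fall",   "no fall protection",                  "hospitalized"),
    ("fall",   "misjudgement of hazardous situation", "hospitalized"),
    ("struck", "equipment malfunction",               "non-hospitalized"),
    ("struck", "misjudgement of hazardous situation", "fatality"),
]

def rank_factors(records, accident_type):
    """Rank risk factors by frequency within one accident type,
    most frequent first, to guide mitigation priorities."""
    counts = Counter(factor for atype, factor, _ in records
                     if atype == accident_type)
    return counts.most_common()

print(rank_factors(records, "fall"))
```

Conditioning the tally on accident type mirrors the study's finding that the dominant environment-related factors differ by accident type, so a single global ranking would misdirect mitigation effort.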
Abstract:
Background: Women who birth in private facilities in Australia are more likely to have a caesarean birth than women who birth in public facilities, and these differences remain after accounting for sector differences in women’s demographic and health risk profiles. However, the extent to which women’s preferences and/or freedom to choose their mode of birth further account for sector differences in the likelihood of caesarean birth remains untested. Method: Women who birthed in Queensland, Australia during a two-week period in 2009 were mailed a self-report survey approximately three months after birth. Seven hundred and fifty-seven women provided cross-sectional retrospective data on where they birthed (public or private facility), mode of birth (vaginal or caesarean) and risk factors, along with their preferences and freedom to choose their mode of birth. A hierarchical logistic regression was conducted to determine the extent to which maternal risk and freedom to choose one’s mode of birth explain sector differences in the likelihood of having a caesarean birth. Findings: While there was no sector difference in women’s preference for mode of birth, women who birthed in private facilities had higher odds of feeling able to choose either a vaginal or caesarean birth, and of feeling able to choose only a caesarean birth. Women had higher odds of having a caesarean birth if they birthed in private facilities, even after accounting for significant risk factors such as age, body mass index, previous caesarean and use of assisted reproductive technology. However, there was no association between place of birth and the odds of having a caesarean birth after also accounting for freedom to choose one’s mode of birth.
Conclusions: These findings call into question suggestions that the higher caesarean birth rate in the private sector in Australia is attributable to increased levels of obstetric risk among women birthing in the private sector, or to maternal preferences alone. Instead, the determinants of sector differences in the likelihood of caesarean births are complex and are linked to differences in the perceived choices for mode of birth between women birthing in the private and public systems.
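The sector comparison in this study is expressed in terms of odds. As a minimal sketch of the underlying quantity, here is an unadjusted odds ratio computed from a 2×2 table; the counts are invented for illustration and are not the survey data, and the study itself used hierarchical logistic regression to adjust this comparison for covariates such as age and previous caesarean:

```python
def odds_ratio(a, b, c, d):
    """Unadjusted odds ratio for a 2x2 table:
        a = exposed with outcome      b = exposed without outcome
        c = unexposed with outcome    d = unexposed without outcome
    Returns the odds of the outcome in the exposed group divided by
    the odds in the unexposed group."""
    return (a / b) / (c / d)

# Hypothetical counts: private-sector births with/without caesarean (120/180)
# versus public-sector births with/without caesarean (140/320)
print(odds_ratio(120, 180, 140, 320))  # odds of caesarean, private vs public
```

An odds ratio above 1 indicates higher odds of caesarean birth in the private sector; the study's key finding is that this ratio was no longer distinguishable from 1 once freedom to choose one's mode of birth was added to the model.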
Abstract:
In the past few years, plant biotechnology has gone beyond the traditional agricultural production of food, feed and fibre to address more complex contemporary health, social and industrial challenges. The new era involves the production of novel pharmaceutical products, speciality and fine chemicals, phytoremediation, and renewable energy resources to replace non-renewable fossil fuels. Plants have been shown to provide a genuine, low-cost alternative production system for high-value products. Currently, the principal plant-made products include antibodies, feed additives, vaccine antigens and hormones for human and animal health, and industrial proteins. Despite the unique advantages of scalability, cost and product safety, issues of politics, environmental impact, regulation and socioeconomics still limit the adoption of biopharmaceuticals, especially in the developing world. Plant-based production systems have further complicated the biosafety, gene flow and environmental impact assessments already applied to genetically modified plants generally, topics that are as yet only partially understood. This article provides a background to biopharming, highlighting basic considerations for risk assessment and regulation in developing countries, with an emphasis on plant-based vaccine production in South Africa.
Abstract:
Truancy is recognised as an indicator of engagement in high-risk behaviours among adolescents. Injuries from road-related risk behaviours continue to be a leading cause of death and disability for early adolescents (13-14 years). The aim of this research is to determine the extent to which truancy is related to an increased risk of road-related injuries for early adolescents. Four hundred and twenty-seven Year 9 students (13-14 years) from five high schools in Queensland, Australia, completed a questionnaire about their perceptions of risk and recent injury experience. Self-reported injuries were assessed with the Extended Adolescent Injury Checklist (E-AIC). Injuries resulting from motorcycle use, bicycle use, vehicle use (as passenger or driver), and as a pedestrian were measured for the preceding three months. Students were also asked to indicate whether they sought medical attention for their injuries. Truancy rates were assessed from self-reported skipping class or wagging school over the same three-month period. The relationship between road-related injuries and truancy was analysed separately for males and females. Results revealed that road-related injuries, and reports of associated medical treatment, were higher among young people who engage in truancy than among non-truant adolescents. These results contribute knowledge about truancy as a risk factor for engagement in road-related risks. The findings have the potential to enhance school policies and injury prevention programs if emphasis is placed on increasing school attendance as a safety measure to decrease road-related injuries for young adolescents.