904 results for Risk Reduction Engineering Laboratory (U.S.)
Abstract:
Engineering design processes are necessary to attain the requisite standards of integrity for high-assurance safety-related systems. Additionally, human factors design initiatives can provide critical insights that parameterise their development. Unfortunately, the popular perception of human factors as a “forced marriage” between engineering and psychology often provokes views in which the ‘human factor’ is perceived as a threat to systems design. Some popular performance-based standards for developing safety-related systems advocate identifying and managing human factors throughout the system lifecycle. However, they tend to fall short in their guidance on the application of human factors methods and tools, let alone on how the outputs generated can be integrated into the various stages of the design process. This case study describes a project that converged engineering with human factors to develop a safety argument for new low-cost railway level crossing technology for system-wide implementation in Australia. The paper brings together the perspectives of a software engineer and a cognitive psychologist and their involvement in the project over two years of collaborative work to develop a safety argument for low-cost level crossing technology. Safety and reliability requirements were informed by applying human factors analytical tools that supported the evaluation and quantification of human reliability where users interfaced with the technology. The project team was confronted with significant challenges in cross-disciplinary engagement, particularly the complexities of dealing with incongruences in disciplinary language. They were also encouraged to think ‘outside the box’ about how users of a system interpreted system states and behaviour. Importantly, some of these states, while considered safe within the boundary of the constituent systems that implemented safety-related functions, could actually lead users to engage in deviant behaviour. Psychology explained how user compliance could be eroded to levels that effectively undermined the risk reduction afforded by the systems. By linking the engineering and psychology disciplines, overall safety performance was improved by introducing technical requirements and making design decisions that minimised the system states and behaviours that led to user deviancy. As a commentary on the utility of transdisciplinary collaboration for technical specification, the processes used to bridge the two disciplines are conceptualised in a graphical model.
Abstract:
This paper proposes a recommendation system that supports process participants in taking risk-informed decisions, with the goal of reducing risks that may arise during process execution. Risk reduction involves decreasing the likelihood of a process fault occurring and its severity. Given a business process exposed to risks, e.g. a financial process exposed to a risk of reputation loss, we enact this process and, whenever a process participant needs to provide input to the process, e.g. by selecting the next task to execute or by filling out a form, we suggest to the participant the action that minimizes the predicted process risk. Risks are predicted by traversing decision trees generated from the logs of past process executions, which consider process data, involved resources, task durations and other information elements such as task frequencies. When applied in the context of multiple process instances running concurrently, a second technique is employed that uses integer linear programming to compute the optimal assignment of resources to the tasks to be performed, in order to deal with the interplay between risks relative to different instances. The recommendation system has been implemented as a set of components on top of the YAWL BPM system and its effectiveness has been evaluated using a real-life scenario, in collaboration with risk analysts of a large insurance company. The results, based on a simulation of the real-life scenario and its comparison with the event data provided by the company, show that process instances executed concurrently complete with significantly fewer faults and with lower fault severities when the recommendations provided by our recommendation system are taken into account.
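A minimal sketch of the risk-prediction step described above, assuming a fault-risk model learned from past execution logs is available as a scikit-learn decision tree. The feature encoding, the candidate-action set and the encode_state helper are hypothetical illustrations, not the paper's actual implementation (which additionally solves an integer linear program to assign resources across concurrent instances, not shown here).

```python
# Sketch: recommend the next task by scoring each candidate action with a
# decision tree trained on features extracted from past process-execution logs.
# Assumes historical cases are already encoded as numeric feature vectors with
# a binary label (1 = the case ended in a fault). All names are illustrative.
from sklearn.tree import DecisionTreeClassifier


def train_risk_model(history_features, fault_labels):
    """Fit a decision tree that estimates the probability of a process fault."""
    model = DecisionTreeClassifier(max_depth=5, min_samples_leaf=20)
    model.fit(history_features, fault_labels)
    return model


def encode_state(case_data, candidate_task):
    """Hypothetical numeric encoding of the current case plus one candidate
    next task, matching the training layout (process data, resource index,
    elapsed duration, task frequency, ...)."""
    return [
        case_data["claim_amount"],
        case_data["elapsed_minutes"],
        case_data["resource_index"],
        candidate_task["task_index"],
        candidate_task["historical_frequency"],
    ]


def recommend_next_task(model, case_data, candidate_tasks):
    """Return the candidate task with the lowest predicted fault probability."""
    scored = []
    for task in candidate_tasks:
        features = [encode_state(case_data, task)]
        fault_probability = model.predict_proba(features)[0][1]
        scored.append((fault_probability, task))
    return min(scored, key=lambda pair: pair[0])[1]
```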
Abstract:
Severe dioxin contamination at the Bien Hoa and Da Nang airbases in Vietnam is of international concern. Public health risk reduction programs were implemented in Bien Hoa in 2007-2009 and in Da Nang in 2009-2011. In 2009 and 2011 we reported the encouraging results of these interventions in improving the knowledge, attitudes and practices (KAP) of local residents in reducing the risk of dioxin exposure through foods. In 2013 we revisited these dioxin hot spots, aiming to evaluate whether the results of the intervention had been maintained and to identify factors affecting the sustainability of the programs. To assess this, 16 in-depth interviews, six focus group discussions, and pre- and post-intervention KAP surveys were undertaken. 800 respondents from six intervention wards and 200 respondents from Buu Long Ward (the control site) were randomly selected to participate in the surveys. The results showed that as of 2013, the programs were rated as "moderately sustained" with a score of 3.3 out of 5.0 (cut-off points 2.5 to <3.5) for Bien Hoa, and "well sustained" with a score of 3.8 out of 5.0 (cut-off points 3.5 to <4.5) for Da Nang. Most formal intervention program activities had ceased and dioxin risk communication activities were no longer integrated into local routine health education programs. However, the main outcomes were maintained and were better than those in the control ward. Migration, the lack of official guidance from City People's Committees and local authorities, as well as the politically sensitive nature of dioxin issues, were the main challenges for the sustainability of the programs.
Abstract:
This book represents a landmark effort to probe and analyze the theory and empirics of designing water disaster management policies. It consists of seven chapters that examine, in-depth and comprehensively, issues that are central to crafting effective policies for water disaster management. The authors use historical surveys, institutional analysis, econometric investigations, empirical case studies, and conceptual-theoretical discussions to clarify and illuminate the complex policy process. The specific topics studied in this book include a review and analysis of key policy areas and research priority areas associated with water disaster management, community participation in disaster risk reduction, the economics and politics of ‘green’ flood control, probabilistic flood forecasting for flood risk management, polycentric governance and flood risk management, drought management with the aid of dynamic inter-generational preferences, and how social resilience can inform SA/SIA for adaptive planning for climate change in vulnerable areas. A unique feature of this book is its analysis of the causes and consequences of water disasters and efforts to address them successfully through policy-rich, cross-disciplinary and transnational papers. This book is designed to help enrich the sparse discourse on water disaster management policies and galvanize water professionals to craft creative solutions to tackle water disasters efficiently, equitably, and sustainably. This book should also be of considerable use to disaster management professionals, in general, and natural resource policy analysts.
Abstract:
This study assessed the environmental health risk from dioxin in foods and the sustainability of risk reduction programs at two heavily contaminated former military sites in Vietnam. The study involved 1000 household surveys, analysis of food samples and in-depth discussions with residents and officials. The findings indicate that more than 40 years after the war, local residents still experience high exposure to dioxin if they consume local high-risk foods. Public health intervention programs were rated moderately to well sustained. Internal migration, together with a lack of clear official guidance and the political sensitivity of dioxin issues, were the main challenges for the sustainability of prevention programs.
Abstract:
Background To investigate potential cardiovascular and other effects of long-term pharmacological interleukin 1 (IL-1) inhibition, we studied genetic variants that produce inhibition of IL-1, a master regulator of inflammation. Methods We created a genetic score combining the effects of alleles of two common variants (rs6743376 and rs1542176) that are located upstream of IL1RN, the gene encoding the IL-1 receptor antagonist (IL-1Ra; an endogenous inhibitor of both IL-1α and IL-1β); both alleles increase soluble IL-1Ra protein concentration. We compared effects on inflammation biomarkers of this genetic score with those of anakinra, the recombinant form of IL-1Ra, which has previously been studied in randomised trials of rheumatoid arthritis and other inflammatory disorders. In primary analyses, we investigated the score in relation to rheumatoid arthritis and four cardiometabolic diseases (type 2 diabetes, coronary heart disease, ischaemic stroke, and abdominal aortic aneurysm; 453 411 total participants). In exploratory analyses, we studied the relation of the score to many disease traits and to 24 other disorders of proposed relevance to IL-1 signalling (746 171 total participants). Findings For each IL1RN minor allele inherited, serum concentrations of IL-1Ra increased by 0·22 SD (95% CI 0·18–0·25; 12·5%; p=9·3 × 10−33), concentrations of interleukin 6 decreased by 0·02 SD (−0·04 to −0·01; −1·7%; p=3·5 × 10−3), and concentrations of C-reactive protein decreased by 0·03 SD (−0·04 to −0·02; −3·4%; p=7·7 × 10−14). We noted the effects of the genetic score on these inflammation biomarkers to be directionally concordant with those of anakinra. The allele count of the genetic score had roughly log-linear, dose-dependent associations with both IL-1Ra concentration and risk of coronary heart disease. For people who carried four IL-1Ra-raising alleles, the odds ratio for coronary heart disease was 1·15 (1·08–1·22; p=1·8 × 10−6) compared with people who carried no IL-1Ra-raising alleles; the per-allele odds ratio for coronary heart disease was 1·03 (1·02–1·04; p=3·9 × 10−10). Per-allele odds ratios were 0·97 (0·95–0·99; p=9·9 × 10−4) for rheumatoid arthritis, 0·99 (0·97–1·01; p=0·47) for type 2 diabetes, 1·00 (0·98–1·02; p=0·92) for ischaemic stroke, and 1·08 (1·04–1·12; p=1·8 × 10−5) for abdominal aortic aneurysm. In exploratory analyses, we observed per-allele increases in concentrations of proatherogenic lipids, including LDL-cholesterol, but no clear evidence of association for blood pressure, glycaemic traits, or any of the 24 other disorders studied. Modelling suggested that the observed increase in LDL-cholesterol could account for about a third of the association observed between the genetic score and increased coronary risk. Interpretation Human genetic data suggest that long-term dual IL-1α/β inhibition could increase cardiovascular risk and, conversely, reduce the risk of development of rheumatoid arthritis. The cardiovascular risk might, in part, be mediated through an increase in proatherogenic lipid concentrations. Funding UK Medical Research Council, British Heart Foundation, UK National Institute for Health Research, National Institute for Health Research Cambridge Biomedical Research Centre, European Research Council, and European Commission Framework Programme 7.
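As a worked check of the "roughly log-linear, dose-dependent" association reported above, the per-allele odds ratio implies a multiplicative model in which carrying k IL-1Ra-raising alleles scales the odds of coronary heart disease by roughly 1.03 to the power k; the sketch below simply restates the reported estimates under that assumption.

```latex
% Multiplicative (log-linear) per-allele model implied by the reported estimates.
% The per-allele odds ratio for coronary heart disease is 1.03, so for k = 4 alleles:
\[
\mathrm{OR}(k) \approx \mathrm{OR}_{\mathrm{allele}}^{\,k}
\qquad\Longrightarrow\qquad
\mathrm{OR}(4) \approx 1.03^{4} \approx 1.13 ,
\]
% broadly consistent with the directly estimated odds ratio of 1.15
% (95% CI 1.08-1.22) for carriers of four IL-1Ra-raising alleles.
```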
Abstract:
A mathematician tends to have an intense relationship with treatises – one which is more akin to that of a historian than that of her colleagues in the ‘hard’ sciences. A book may be a century or two old and still be relevant as a source of information or inspiration, well-thumbed textbooks from youth might be still consulted decades later, and fierce arguments rage about the relative merits of different treatments of the same subject. And much like any book-lover, a mathematician is forever arguing with herself whether she can afford to buy this volume or the other. When the price label is in dollars or euros and the salary paid in rupees, this last dilemma is particularly acute.
Abstract:
Seismic microzonation has generally been recognized as the most accepted tool in seismic hazard assessment and risk evaluation. In general, risk can be reduced by reducing the hazard, the vulnerability or the value at risk. Since the earthquake hazard itself cannot be reduced, one has to concentrate on vulnerability and value at risk. The vulnerability of an urban area or municipality depends on the vulnerability of its infrastructure and the redundancies within that infrastructure. The earthquake risk comprises the damage to buildings, the number of people killed or injured, and the economic losses caused by an earthquake with a given return period. The principal approaches one can follow to reduce these losses are to avoid, where possible, high-hazard areas for the siting of buildings and infrastructure, and further to ensure that buildings and infrastructure are designed and constructed to resist the expected earthquake loads. This can be done if one can assess the hazard at local scales. Seismic microzonation maps provide the basis for scientifically based decision-making to reduce earthquake risk for government and public agencies, private owners and the general public. Further, seismic microzonation carried out on an appropriate scale provides a valuable tool for disaster mitigation planning and emergency response planning for urban centres and municipalities. It provides the basis for identifying the areas of a city or municipality that are most likely to experience serious damage in the event of an earthquake.
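The decomposition described above (risk reducible through hazard, vulnerability, or value at risk) is commonly written as a product of those three factors; the expression below is the conventional textbook form, not a formula stated in this abstract.

```latex
% Conventional risk decomposition underlying the argument above:
% since the hazard H cannot be reduced, risk reduction must act on the
% vulnerability V or on the exposed value (elements at risk) E.
\[
R = H \times V \times E
\]
```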
Abstract:
Reported distress to an industrial structure from phosphate/sulfate contamination of kaolinitic foundation soil at an industrial location in Southern India prompted this laboratory study. The study examines the short-term effect of sodium sulfate/phosphate contamination on the swell/compression characteristics of a commercial kaolinite. Experimental results showed that the unsaturated contaminated kaolinite specimens exhibited slightly higher swell potentials and lower compressions than the unsaturated uncontaminated kaolinite specimens. It is suggested that the larger double layer promoted by the increased exchangeable sodium ion concentration is responsible for the slightly higher swell potentials and lower compressions of the unsaturated contaminated kaolinite specimens.
Abstract:
This paper presents an overview of seismic microzonation and the grade/level-based study, along with the methods used for estimating hazard. The principles of seismic microzonation and some current practices are discussed, and a summary of seismic microzonation experiments carried out in India is presented. A detailed seismic microzonation of Bangalore is presented as a case study. In this case study, a seismotectonic map for the microzonation area has been developed covering a 350 km radius around Bangalore, India, using seismicity and seismotectonic parameters of the region. For seismic microzonation, the Bangalore Mahanagar Palike (BMP) area of 220 km2 has been selected as the study area. Seismic hazard analysis has been carried out using deterministic as well as probabilistic approaches. Synthetic ground motion at 653 locations, a recurrence relation and peak ground acceleration maps at rock level have been generated. A detailed site characterization has been carried out using borehole data with standard penetration test (SPT) "N" values and geophysical data. The base map and a 3-dimensional subsurface borehole model have been generated for the study area using a geographical information system (GIS). The multichannel analysis of surface waves (MASW) method has been used to generate one-dimensional shear wave velocity profiles at 58 locations and two-dimensional profiles at 20 locations. These shear wave velocities are used to estimate the equivalent shear wave velocity in the study area at 5 m intervals up to a depth of 30 m. Because of the wide variation in rock depth, the equivalent shear wave velocity for the soil overburden thickness alone has been estimated and mapped using ArcGIS 9.2. Based on the equivalent shear wave velocity of the soil overburden thickness, the study area is classified as site class D. A site response study has been carried out using geotechnical properties and synthetic ground motions with the program SHAKE2000. The soil in the study area is classified as soil with moderate amplification potential. Site response results obtained using standard penetration test (SPT) "N" values and shear wave velocity were compared; the results based on shear wave velocity are lower than those based on SPT "N" values. Further, the predominant frequency of the soil column has been estimated from ambient noise survey measurements using L4-3D short-period sensors equipped with Reftek 24-bit digital acquisition systems. The predominant frequency obtained from the site response study is compared with that from the ambient noise survey. In general, predominant frequencies in the study area vary from 3 Hz to 12 Hz. Because of the flat terrain in the study area, the induced effect of landslide possibility is considered remote; however, the induced effect of liquefaction hazard has been estimated and mapped. Finally, by integrating the above hazard parameters, two hazard index maps have been developed using the Analytic Hierarchy Process (AHP) on a GIS platform: one based on deterministic hazard analysis and the other based on probabilistic hazard analysis. A general guideline is then proposed, bringing out the advantages and disadvantages of the different approaches.
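A minimal sketch of how an equivalent (time-averaged) shear wave velocity, such as the one mapped above, can be computed from a layered profile. The 5 m layering and the example values are illustrative assumptions; the study's own procedure (GIS-based mapping over the soil overburden only) is not reproduced here.

```python
# Time-averaged (equivalent) shear wave velocity over a layered soil profile:
# V_eq = (total thickness) / (sum of per-layer travel times d_i / Vs_i).
# Layers are given as (thickness_m, shear_wave_velocity_m_per_s) pairs.

def equivalent_vs(layers, target_depth=30.0):
    """Compute the equivalent shear wave velocity down to target_depth metres.

    Layers beyond target_depth are truncated; if the profile is shallower
    than target_depth (e.g. rock is reached early), only the available
    thickness is used, as when averaging over the soil overburden alone.
    """
    depth = 0.0
    travel_time = 0.0
    for thickness, vs in layers:
        if depth >= target_depth:
            break
        used = min(thickness, target_depth - depth)
        depth += used
        travel_time += used / vs
    return depth / travel_time


# Illustrative 5 m layers (not measured values from the Bangalore study):
profile = [(5.0, 180.0), (5.0, 220.0), (5.0, 260.0), (5.0, 300.0),
           (5.0, 360.0), (5.0, 420.0)]
print(f"Equivalent Vs over 30 m: {equivalent_vs(profile):.0f} m/s")
# Prints roughly 267 m/s, which would fall in site class D (180-360 m/s).
```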
Abstract:
Lime stabilization remains the most widely adopted in situ stabilization method for controlling the swell-shrink potentials of expansive soils, despite construction difficulties and its ineffectiveness in certain conditions. In addition to the in situ stabilization methods presently practiced, it is theoretically possible to facilitate in situ precipitation of lime in soil by successive permeation of calcium chloride (CaCl2) and sodium hydroxide (NaOH) solutions into the expansive soil. In this laboratory investigation, an attempt is made to study the precipitation of lime in soil by successive mixing of CaCl2 and NaOH solutions with the expansive soil in two different sequences. Experimental results indicated that in situ precipitation of lime in soil by sequential mixing of CaCl2 and NaOH solutions with expansive soil developed strong lime-modification and soil-lime pozzolanic reactions. The lime-modification reactions, together with the poorly developed cementation products, controlled the swelling potential, reduced the plasticity index, and increased the unconfined compressive strength of the expansive clay cured for 24 h. Comparatively, both lime-modification reactions and well-developed crystalline cementation products (formed by lime-soil pozzolanic reactions) contributed to the marked increase in the unconfined compressive strength of the expansive soil that was cured for 7–21 days. Results also show that sequential mixing of expansive soil with CaCl2 solution followed by NaOH solution is more effective than mixing with NaOH solution followed by CaCl2 solution. DOI: 10.1061/(ASCE)MT.1943-5533.0000483. © 2012 American Society of Civil Engineers.
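The in situ lime precipitation described above relies on the metathesis reaction between the two permeated solutions; the balanced equation below is standard chemistry rather than something stated explicitly in the abstract.

```latex
% Precipitation of lime (calcium hydroxide) from the two permeated solutions:
\[
\mathrm{CaCl_2\,(aq)} + 2\,\mathrm{NaOH\,(aq)} \;\longrightarrow\;
\mathrm{Ca(OH)_2\,(s)\!\downarrow} + 2\,\mathrm{NaCl\,(aq)}
\]
```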
Abstract:
Northeast India is one of the most seismically active regions in the world, with on average more than seven earthquakes of magnitude 5.0 and above per year. Reliable seismic hazard assessment can provide the necessary design inputs for earthquake-resistant design of structures in this region. In this study, deterministic as well as probabilistic methods have been attempted for seismic hazard assessment of the Tripura and Mizoram states at bedrock level. An updated earthquake catalogue was compiled from various national and international seismological agencies for the period from 1731 to 2011. Homogenization, declustering and data completeness analysis of the events were carried out before hazard evaluation. Seismicity parameters were estimated using the Gutenberg-Richter (G-R) relationship for each source zone. Based on the seismicity, tectonic features and fault rupture mechanisms, the region was divided into six major subzones. Region-specific correlations were used for magnitude conversion to homogenize earthquake size. Ground motion equations (Atkinson and Boore 2003; Gupta 2010) were validated against observed PGA (peak ground acceleration) values before use in the hazard evaluation. In this study, the hazard is estimated using linear sources identified in and around the study area. Results are presented in the form of PGA using both DSHA (deterministic seismic hazard analysis) and PSHA (probabilistic seismic hazard analysis) with 2% and 10% probability of exceedance in 50 years, and spectral acceleration (T = 0.2 s, 1.0 s) for both states (2% probability of exceedance in 50 years). The results provide inputs for planning risk reduction strategies, developing risk acceptance criteria and analysing the financial consequences of possible damage in the study area, based on a comprehensive analysis and higher-resolution hazard mapping.
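For reference, the G-R (Gutenberg-Richter) recurrence relationship used to estimate the seismicity parameters mentioned above has the standard form below, where N is the number of earthquakes of magnitude at least M and a, b are the zone-specific parameters fitted from the catalogue; this is the conventional form, and the abstract does not state which variant the authors fitted.

```latex
% Gutenberg-Richter magnitude-frequency (recurrence) relationship:
\[
\log_{10} N(\ge M) = a - b\,M
\]
% a characterises the overall seismicity rate of a source zone and
% b the relative proportion of small to large earthquakes.
```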
Abstract:
Leonard Carpenter Panama Canal Collection. Photographs: Dredging, Soldiers, and Ships. [Box 1] from the Special Collections & Area Studies Department, George A. Smathers Libraries, University of Florida.
Abstract:
Bycatch can harm marine ecosystems, reduce biodiversity, lead to injury or mortality of protected species, and have severe economic implications for fisheries. On 12 January 2007, President George W. Bush signed the Magnuson-Stevens Fishery Conservation and Management Reauthorization Act of 2006 (MSRA). The MSRA required the U.S. Secretary of Commerce (Secretary) to establish a Bycatch Reduction Engineering Program (BREP) to develop technological devices and other conservation engineering changes designed to minimize bycatch, seabird interactions, bycatch mortality, and post-release mortality in Federally managed fisheries. The MSRA also required the Secretary to identify nations whose vessels are engaged in the bycatch of protected living marine resources (PLMR’s) under specified circumstances and to certify that these nations have 1) adopted regulatory programs for PLMR’s that are comparable to U.S. programs, taking into account different conditions, and 2) established management plans for PLMR’s that assist in the collection of data to support assessments and conservation of these resources. If a nation fails to take sufficient corrective action and does not receive a positive certification, fishing products from that country may be subject to import prohibitions into the United States. The BREP has made significant progress in developing technological devices and other conservation engineering designed to minimize bycatch, including improvements to bycatch reduction devices and turtle excluder devices in Atlantic and Gulf of Mexico trawl fisheries, gillnets in Northeast fisheries, and trawls in Alaska and Pacific Northwest fisheries. In addition, the international provisions of the MSRA have provided an innovative tool through which the United States can address bycatch by foreign nations. However, to date the National Marine Fisheries Service has been unable to identify nations whose vessels are engaged in the bycatch of PLMR’s, so additional approaches will need to be developed to meet this mandate.
Abstract:
BACKGROUND: A Royal Statistical Society Working Party recently recommended that "Greater use should be made of numerical, as opposed to verbal, descriptions of risk" in first-in-man clinical trials. This echoed the view of many clinicians and psychologists about risk communication. As the clinical trial industry expands rapidly across the globe, it is important to understand risk communication in Asian countries. METHODS: We conducted a cognitive experiment about participation in a hypothetical clinical trial of a pain relief medication and a survey in cancer and arthritis patients in Singapore. In part 1 of the experiment, the patients received information about the risk of side effects in one of three formats (frequency, percentage and verbal descriptor) and in one of two sequences (from least to most severe and from most to least severe), and were asked about their willingness to participate. In part 2, the patients received information about the risk in all three formats, in the same sequence, and were again asked about their willingness to participate. A survey of preference for risk presentation methods and usage of verbal descriptors immediately followed. RESULTS: Willingness to participate and the likelihood of changing one's decision were not affected by the risk presentation methods. Most patients indicated a preference for the frequency format, but patients with primary school or no formal education were indifferent. While the patients used the verbal descriptors "very common", "common" and "very rare" in ways similar to the European Commission's Guidelines, their usage of the descriptors "uncommon" and "rare" was substantially different from the EU's. CONCLUSION: In this sample of Asian cancer and arthritis patients, risk presentation format had no impact on willingness to participate in a clinical trial. However, there is a clear preference for the frequency format. The lay use of verbal descriptors was substantially different from the EU's.