948 results for deterministic safety analysis


Relevance: 30.00%

Abstract:

This is an empirical study whose purpose was to examine the process of innovation adoption as an adaptive response by a public organization and its subunits existing under varying degrees of environmental uncertainty. Meshing organizational innovation research and contingency theory to form a theoretical framework, an exploratory case study design was undertaken in a large, metropolitan government located in an area with the fourth highest prevalence rate of HIV/AIDS in the country. A number of environmental and organizational factors were examined for their influence upon decision making in the adoption/non-adoption as well as implementation of any number of AIDS-related policies, practices, and programs. The major findings of the study are as follows. For the county government itself (macro level), no AIDS-specific workplace policies have been adopted. AIDS activities (AIDS education, AIDS Task Force, AIDS Coordinator, etc.), adopted county-wide early in the epidemic, have all been abandoned. Worker infection rates, in the aggregate and throughout the epidemic, have been small. As a result, absent co-worker conflict (isolated and negligible), increases in employee health care costs, litigation regarding discrimination, or major impacts on workforce productivity, AIDS has basically become a non-issue at the strategic core of the organization. At the departmental level, policy adoption decisions varied widely. Here the predominant issue is occupational risk, both objective and perceived. As expected, more AIDS-related activities (policies, practices, and programs) were found in departments with workers known to have significant risk of exposure to the AIDS virus (fire rescue, medical examiner, police, etc.). Adoption of AIDS-specific policies, in the form of OSHA's Bloodborne Pathogens Standard, took place primarily because they were legislatively mandated. Union participation varied widely, although not necessarily based upon worker risk. In several departments, the union was a primary factor bringing about adoption decisions. Additional factors were identified, including organizational presence of AIDS expertise, availability of slack resources, and the existence of a policy champion. Other variables, such as subunit size, centralization of decision making, and formalization, were not consistent factors explaining adoption decisions.

Relevance: 30.00%

Abstract:

Protecting confidential information from improper disclosure is a fundamental security goal. While encryption and access control are important tools for ensuring confidentiality, they cannot prevent an authorized system from leaking confidential information to its publicly observable outputs, whether inadvertently or maliciously. Hence, secure information flow aims to provide end-to-end control of information flow. Unfortunately, the traditionally-adopted policy of noninterference, which forbids all improper leakage, is often too restrictive. Theories of quantitative information flow address this issue by quantifying the amount of confidential information leaked by a system, with the goal of showing that it is intuitively "small" enough to be tolerated. Given such a theory, it is crucial to develop automated techniques for calculating the leakage in a system. This dissertation is concerned with program analysis for calculating the maximum leakage, or capacity, of confidential information in the context of deterministic systems and under three proposed entropy measures of information leakage: Shannon entropy leakage, min-entropy leakage, and g-leakage. In this context, it turns out that calculating the maximum leakage of a program reduces to counting the number of possible outputs that it can produce. The new approach introduced in this dissertation is to determine two-bit patterns, the relationships among pairs of bits in the output; for instance, we might determine that two bits must be unequal. By counting the number of solutions to the two-bit patterns, we obtain an upper bound on the number of possible outputs, and hence a bound on the maximum leakage. We first describe a straightforward computation of the two-bit patterns using an automated prover. We then show a more efficient implementation that uses an implication graph to represent the two-bit patterns. It efficiently constructs the graph through the use of an automated prover, random executions, STP counterexamples, and deductive closure. The effectiveness of our techniques, both in terms of efficiency and accuracy, is shown through a number of case studies found in recent literature.
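
To make the counting step concrete: for a deterministic program, the min-entropy capacity in bits is log2 of the number of feasible outputs. A minimal sketch (the 8-bit example program is hypothetical, not one of the dissertation's case studies):

```python
import math

# For a deterministic program, maximum (min-entropy) leakage equals
# log2(number of distinct outputs). The example program is hypothetical.
def program(secret: int) -> int:
    return secret >> 6  # reveals only the top two bits of an 8-bit secret

outputs = {program(s) for s in range(2 ** 8)}  # enumerate all 256 inputs
print(f"{len(outputs)} feasible outputs -> capacity {math.log2(len(outputs)):.1f} bits")
# -> 4 feasible outputs -> capacity 2.0 bits
```

The dissertation's contribution is bounding this count without exhaustive enumeration, via the two-bit patterns described above.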

Relevance: 30.00%

Abstract:

The Highway Safety Manual (HSM) estimates roadway safety performance based on predictive models that were calibrated using national data. Calibration factors are then used to adjust these predictive models to local conditions for local applications. The HSM recommends that local calibration factors be estimated using 30 to 50 randomly selected sites that experienced at least a total of 100 crashes per year. It also recommends that the factors be updated every two to three years, preferably on an annual basis. However, these recommendations are primarily based on expert opinion rather than data-driven research findings. Furthermore, most agencies do not have data for many of the input variables recommended in the HSM. This dissertation is aimed at determining the best way to meet three major data needs affecting the estimation of calibration factors: (1) the required minimum sample sizes for different roadway facilities, (2) the required frequency for calibration factor updates, and (3) the influential variables affecting calibration factors. In this dissertation, statewide segment and intersection data were first collected for most of the HSM-recommended calibration variables using a Google Maps application. In addition, eight years (2005-2012) of traffic and crash data were retrieved from existing databases of the Florida Department of Transportation. With these data, the effect of the sample size criterion on calibration factor estimates was first studied using a sensitivity analysis. The results showed that the minimum sample sizes not only vary across different roadway facilities, but are also significantly higher than those recommended in the HSM. In addition, results from paired-sample t-tests showed that calibration factors in Florida need to be updated annually. To identify influential variables affecting the calibration factors for roadway segments, the variables were prioritized by combining the results from three different methods: negative binomial regression, random forests, and boosted regression trees. Only a few variables were found to explain most of the variation in the crash data. Traffic volume was consistently found to be the most influential. In addition, roadside object density, major and minor commercial driveway densities, and minor residential driveway density were also identified as influential variables.
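
For illustration, the calibration computation itself reduces to a ratio of observed to predicted crashes over the sampled sites, applied as N_local = C * N_SPF. A minimal sketch under an assumed HSM-style safety performance function (SPF) form, with made-up coefficients and site data rather than HSM or Florida values:

```python
import math

# Hedged sketch: C = (total observed crashes) / (total SPF-predicted crashes).
# The SPF form, coefficients, and site data are placeholders.
def spf_predicted(aadt: float, length_mi: float,
                  a: float = -8.0, b: float = 1.0) -> float:
    # Generic HSM-style segment SPF: N = exp(a) * AADT^b * L (assumed form)
    return math.exp(a) * aadt ** b * length_mi

sites = [  # (AADT, segment length in miles, observed crashes per year) -- made-up
    (12000, 0.8, 4), (25000, 1.2, 9), (8000, 0.5, 1),
]
C = sum(obs for *_, obs in sites) / sum(spf_predicted(v, l) for v, l, _ in sites)
print(f"local calibration factor C = {C:.2f}")
```

The dissertation's sensitivity analysis asks how stable such a C is as the number and mix of sampled sites change.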

Relevance: 30.00%

Abstract:

This research explores Bayesian updating as a tool for estimating parameters probabilistically by dynamic analysis of data sequences. Two distinct Bayesian updating methodologies are assessed. The first approach focuses on Bayesian updating of failure rates for primary events in fault trees. A Poisson Exponentially Weighted Moving Average (PEWMA) model is implemented to carry out Bayesian updating of failure rates for individual primary events in the fault tree. To provide a basis for testing the PEWMA model, a fault tree is developed based on the Texas City Refinery incident, which occurred in 2005. A qualitative fault tree analysis is then carried out to obtain a logical expression for the top event. A dynamic fault tree analysis is carried out by evaluating the top event probability at each Bayesian updating step by Monte Carlo sampling from posterior failure rate distributions. It is demonstrated that PEWMA modeling is advantageous over conventional conjugate Poisson-Gamma updating techniques when failure data are collected over long time spans. The second approach focuses on Bayesian updating of parameters in non-linear forward models. Specifically, the technique is applied to the hydrocarbon material balance equation. In order to test the accuracy of the implemented Bayesian updating models, a synthetic data set is developed using the Eclipse reservoir simulator. Both structured-grid and MCMC sampling based solution techniques are implemented and are shown to model the synthetic data set with good accuracy. Furthermore, a graphical analysis shows that the implemented MCMC model displays good convergence properties. A case study demonstrates that the likelihood variance affects the rate at which the posterior assimilates information from the measured data sequence. Error in the measured data significantly affects the accuracy of the posterior parameter distributions. Increasing the likelihood variance mitigates random measurement errors, but causes the overall variance of the posterior to increase. Bayesian updating is shown to be advantageous over deterministic regression techniques as it allows for incorporation of prior belief and full modeling of uncertainty over the parameter ranges. As such, the Bayesian approach to estimation of parameters in the material balance equation shows utility for incorporation into reservoir engineering workflows.
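
For reference, the conventional conjugate Poisson-Gamma baseline that PEWMA is compared against has a simple closed form; a minimal sketch with illustrative numbers:

```python
# With a Gamma(alpha, beta) prior on the failure rate lambda and k failures
# observed over exposure time T, the posterior is Gamma(alpha + k, beta + T).
# All numbers below are illustrative, not drawn from the thesis.
alpha, beta = 1.0, 2.0                        # prior shape and rate (assumed)
history = [(3, 10.0), (1, 10.0), (0, 10.0)]   # (failures, exposure in years)

for k, T in history:
    alpha, beta = alpha + k, beta + T         # conjugate update per interval

print(f"posterior Gamma({alpha:.0f}, {beta:.0f}); mean rate {alpha / beta:.3f} per year")
```

Because every interval is weighted equally here, old data never loses influence; PEWMA instead discounts past observations exponentially, which is what suits it to long data-collection spans over which the underlying rate may drift.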

Relevance: 30.00%

Abstract:

Workplace violence is defined as an act of abuse, threatening behaviour, intimidation, or assault on a person in his or her place of employment. Unfortunately, such violence is a reality for nurses, taking the form of physical and verbal abuse, threatening behaviour, and harassment. Violence, particularly verbal abuse, is so prevalent that it is often considered "part of the job" and can contribute to many negative professional and personal effects for nurses. Therefore, it is important to understand what influences an individual to become violent in order to suggest and support initiatives to decrease such violence. A literature review and consultations with key stakeholders were conducted to gather relevant information regarding violence committed by patients and others visiting mental health care settings. Through data analysis, two relevant themes were revealed: reporting and interventions. Reporting incidents of workplace violence is important to track and quantify aggressive episodes, thus emphasizing their seriousness. Nurses may differ in their perception of what constitutes violence, underreport incidents, and feel a sense of futility when reported violence continues. In addition, cumbersome methods of reporting can be a hindrance to the reporting process. Six areas of potential intervention were identified to increase safety for nurses: staffing, de-escalation training, environmental considerations, addictions services, organizational support, and consequences. All findings were summarized in a document to be presented to the leadership of the Mental Health and Addictions program within the local health care authority. The goal is to offer recommendations that lead to a decrease in workplace aggression and increased safety for nurses in the acute care psychiatric setting.

Relevance: 30.00%

Abstract:

Acknowledgements This article is based on a doctoral research project of the first author which was sponsored by an international drilling rig operator. The views presented are those of the authors and should not be taken to represent the position or policy of the sponsor. The authors wish to thank the industrial supervisor and the drilling experts for their contribution and patience, as well as Aberdeen Drilling School for allowing the first author to attend one of their well control courses.

Relevance: 30.00%

Abstract:

A landfill represents a complex and dynamically evolving structure that can be stochastically perturbed by exogenous factors. Both thermodynamic (equilibrium) and time-varying (non-steady state) properties of a landfill are affected by spatially heterogeneous and nonlinear subprocesses that combine with constraining initial and boundary conditions arising from the associated surroundings. While multiple attempts have been made to model landfill statistics by incorporating spatially dependent parameters on the one hand (data-based approach) and continuum dynamical mass-balance equations on the other (equation-based modelling), practically no attempt has been made to amalgamate these two approaches while also incorporating the inherent stochastically induced fluctuations affecting the process overall. In this article, we implement a minimalist scheme for modelling the time evolution of a realistic three-dimensional landfill through a reaction-diffusion based approach, focusing on the coupled interactions of four key variables (solid, hydrolysed, acetogenic, and methanogenic mass densities) that are themselves affected by stochastic fluctuations, coupled with diffusive relaxation of the individual densities, in ambient surroundings. Our results indicate that, close to the linearly stable limit, the large-time steady-state properties, arising out of a series of complex coupled interactions between the stochastically driven variables, are scarcely affected by the biochemical growth-decay statistics. Our results clearly show that an equilibrium landfill structure is primarily determined by the solid and hydrolysed mass densities only, rendering the other variables statistically "irrelevant" in this (large-time) asymptotic limit. The other major implication of incorporating stochasticity in the landfill evolution dynamics is the hugely reduced production times of the plants, now approximately 20-30 years instead of the previous deterministic model predictions of 50 years and above. The predictions from this stochastic model are in conformity with available experimental observations.
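
As a schematic of the kind of scheme described (a minimal sketch under assumed linear rate laws, not the article's calibrated model), one Euler-Maruyama realization of a 1-D stochastic reaction-diffusion system in the four densities might look like:

```python
import numpy as np

# Schematic only: linear conversion chain solid -> hydrolysed -> acetogenic ->
# methanogenic, diffusive relaxation, and additive noise. Rate constants,
# diffusivities, noise amplitude, and the periodic boundary are assumptions.
rng = np.random.default_rng(0)
nx, dx, dt, steps = 50, 1.0, 0.01, 10_000
D = np.array([0.0, 0.1, 0.05, 0.05])    # diffusivities (solid does not diffuse)
k1, k2, k3 = 0.05, 0.08, 0.06           # assumed conversion rates
sigma = 0.01                            # assumed noise amplitude

u = np.ones((4, nx))                    # rows: solid, hydrolysed, aceto., methano.

def laplacian(row):
    # periodic boundary conditions, for simplicity
    return (np.roll(row, 1) + np.roll(row, -1) - 2.0 * row) / dx**2

for _ in range(steps):
    s, h, a, m = u
    reaction = np.array([-k1 * s,
                         k1 * s - k2 * h,
                         k2 * h - k3 * a,
                         k3 * a])
    diffusion = D[:, None] * np.array([laplacian(row) for row in u])
    noise = sigma * np.sqrt(dt) * rng.standard_normal(u.shape)
    u = np.clip(u + dt * (reaction + diffusion) + noise, 0.0, None)
```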

Relevance: 30.00%

Abstract:

The Federal Aviation Administration (FAA) Office of Commercial Space Transportation (AST) has set specific rules and generic guidelines to cover experimental and operational flights by industry forerunners such as Virgin Galactic and XCOR. One such guideline, Advisory Circular (AC) 437.55-1 [1], contains exemplar hazard analyses for spacecraft designers and operators to follow under an experimental permit. The FAA's rules and guidelines have also been ratified in a report to the United States Congress, Analysis of Human Space Flight Safety [2], which states that the industry is too immature and has 'insufficient data' to be prescriptive, and that 'defining a minimum set of criteria for human spaceflight service providers is potentially problematic' in order not to 'stifle the emerging industry'. The authors of this paper acknowledge the immaturity of the industry and discuss the problematic issues that Design Organisations and Operators now face.

Relevance: 30.00%

Abstract:

The Olivia framework is a set of concepts and measures that, when mature, will allow users to describe, in a consistent and integrated manner, everything about individuals and institutions that is of potential interest to social policy. The present paper summarizes the current stage of development towards this highly ambitious goal. The current version of the framework supports analysis of social trends and policy responses from many perspectives:
• The point-in-time, resource-flow perspectives that underlie most traditional, economics-based policy analysis.
• Life-course perspectives, including both transitions/trajectories analysis and asset-based analysis.
• Spatial perspectives that anchor people in space and history and that provide a link to macro-analysis.
• The perspective of the purposes/goals of individuals and institutions, including the objectives of different types of government programming.
The concepts of the framework, which are all potentially measurable, provide a language that can support integrated analysis in all these areas at a much finer level of description than is customary. It provides a language that is especially well suited to analysis of the incremental policy changes that are typical of a mature welfare state. It supports both qualitative and quantitative analysis, enabling some integration between the two. It supports a citizen-centric as well as a government-centric view of social policy. In the current version, the concepts are most highly developed as they relate to social policies concerning labour markets, equality and social integration, care-giving, immigration, income security, sustainability, and social and economic well-being more generally. However, the paper points to likely extensions in the areas of health, justice, and safety.

Relevance: 30.00%

Abstract:

The first objective of this research was to develop closed-form and numerical probabilistic methods of analysis that can be applied to otherwise conventional design methods for unreinforced and geosynthetic reinforced slopes and walls. These probabilistic methods explicitly include the effects of random variability of soil and reinforcement properties, spatial variability of the soil, and cross-correlation between soil input parameters on the probability of failure. The quantitative impact of simultaneously considering random and/or spatial variability in soil properties in combination with cross-correlation in soil properties is investigated for the first time in the research literature. Depending on the magnitude of these statistical descriptors, margins of safety based on conventional notions of safety may be very different from margins of safety expressed in terms of probability of failure (or reliability index). The thesis work also shows that intuitive notions of margin of safety using the conventional factor of safety and the probability of failure can be brought into alignment when cross-correlation between soil properties is considered in a rigorous manner. The second objective of this thesis work was to develop a general closed-form solution to compute the true probability of failure (or reliability index) of a simple linear limit state function with one load term and one resistance term, expressed first in general probabilistic terms and then migrated to a load and resistance factor design (LRFD) format for the purpose of LRFD calibration. The formulation considers contributions to the probability of failure due to model type, uncertainty in bias values, bias dependencies, uncertainty in estimates of nominal values for correlated and uncorrelated load and resistance terms, and the average margin of safety expressed as the operational factor of safety (OFS). Bias is defined as the ratio of measured to predicted value. Parametric analyses were carried out to show that ignoring possible correlations between random variables can lead to conservative (safe) values of resistance factor in some cases and to non-conservative (unsafe) values in others. Example LRFD calibrations were carried out using different load and resistance models for the pullout internal stability limit state of steel strip and geosynthetic reinforced soil walls, together with matching bias data reported in the literature.
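
As a point of reference for the closed-form development described above, the textbook special case for a linear limit state \(g = R - Q\) with jointly normal resistance \(R\) and load \(Q\), correlated with coefficient \(\rho\), is

\[
\beta = \frac{\mu_R - \mu_Q}{\sqrt{\sigma_R^2 + \sigma_Q^2 - 2\rho\,\sigma_R\sigma_Q}},
\qquad
p_f = \Phi(-\beta).
\]

The thesis formulation generalizes this by carrying bias statistics, bias dependencies, and nominal-value uncertainty into the load and resistance terms; the expression above is only the familiar limiting case, but it already shows how correlation between \(R\) and \(Q\) shifts \(\beta\) for fixed means and variances.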

Relevance: 30.00%

Abstract:

Artisanal mining is a global phenomenon that poses threats to environmental health and safety. Ambiguities in the manner in which ore-processing facilities operate hinder the mining capacity of these miners in Ghana. These problems are reviewed on the basis of current socio-economic, health and safety, and environmental conditions, and the use of rudimentary technologies that limit fair-trade deals for miners. This research sought to use an established, data-driven, geographic information system (GIS)-based spatial analysis approach for locating a centralized processing facility within the Wassa Amenfi-Prestea Mining Area (WAPMA) in the Western region of Ghana. A spatial analysis technique that utilizes ModelBuilder within the ArcGIS geoprocessing environment was used to systematically and simultaneously analyze a geographical dataset of selected criteria through suitability modeling. The spatial overlay analysis methodology and the multi-criteria decision analysis approach were selected to identify the most preferred locations for siting a processing facility. For an optimal site selection, seven major criteria were considered: proximity to settlements, water resources, artisanal mining sites, roads, railways, tectonic zones, and slopes. Site characterizations and environmental considerations, incorporating identified constraints such as proximity to large-scale mines, forest reserves, and state lands, were also taken into account. The analysis was limited to criteria that were selected and relevant to the area under investigation. Saaty's analytic hierarchy process was utilized to derive relative importance weights for the criteria, and a weighted linear combination technique was then applied to combine the factors and determine the degree of potential site suitability. The final map output indicates estimated potential sites identified for the establishment of a facility centre. The results obtained provide intuitive areas suitable for consideration.
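
A minimal sketch of the AHP-to-weighted-linear-combination pipeline referenced above, with an invented 3x3 pairwise comparison matrix standing in for the study's seven criteria:

```python
import numpy as np

# Saaty's AHP: priority weights come from the principal eigenvector of the
# pairwise comparison matrix; the matrix values here are illustrative only.
A = np.array([[1.0, 3.0, 5.0],
              [1/3, 1.0, 2.0],
              [1/5, 1/2, 1.0]])

eigvals, eigvecs = np.linalg.eig(A)
i = np.argmax(eigvals.real)
w = eigvecs[:, i].real
w = w / w.sum()                    # normalized priority weights

CI = (eigvals.real[i] - A.shape[0]) / (A.shape[0] - 1)
CR = CI / 0.58                     # Saaty's random index RI = 0.58 for n = 3
print(f"weights = {np.round(w, 3)}, CR = {CR:.3f} (acceptable if < 0.1)")

# Weighted linear combination over normalized criterion rasters (assumed arrays
# of identical shape): suitability = w[0]*roads + w[1]*water + w[2]*slope
```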

Relevance: 30.00%

Abstract:

Background and Objectives: Mobility limitations are a prevalent issue in older adult populations and an important determinant of disability and mortality. Neighborhood conditions are key determinants of mobility, and perception of safety may be one such determinant. Women have more mobility limitations than men, a phenomenon known as the gender mobility gap. The objective of this work was to validate a measure of perception of safety, examine the relationship between neighborhood perception of safety and mobility limitations in seniors, and explore whether these effects vary by gender. Methods: This study was cross-sectional, using questionnaire data collected from community-dwelling older adults at four sites in Canada, Colombia, and Brazil. The exposure variable was the neighborhood-aggregated Perception of Safety (PoS) scale, derived from the Physical and Social Disorder (PSD) scale by Sampson and Raudenbush. Its construct validity was verified using factor analyses and correlation with similar measures. The Mobility Assessment Tool - short form (MAT-sf), a video-based measure validated cross-culturally in the studied populations, was used to assess mobility limitations. Based on theoretical models, covariates were included in the analysis at both the neighborhood level (SES, social capital, and built environment) and the individual level (age, gender, education, income, chronic illnesses, depression, cognitive function, BMI, and social participation). Multilevel modeling was used to account for neighborhood clustering, and gender-specific analyses were carried out. SAS and Mplus were used in this study. Results: PoS was validated across all sites. It loaded on a single factor, after excluding two items, with a Cronbach α value of approximately 0.86. Mobility limitations were present in 22.08% of the sample: 16.32% among men and 27.41% among women. Neighborhood perception of safety was significantly associated with mobility limitations when controlling for all covariates, with an OR of 0.84 (95% CI: 0.73-0.96), indicating lower odds of having mobility limitations as neighborhood perception of safety improves. Gender did not affect this relationship, despite women being more likely to have mobility limitations and to live in neighborhoods with poor perception of safety. Conclusion: Neighborhood perception of safety affected the prevalence of mobility limitations in older adults in the studied population.
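
For readers interpreting the reported estimate, a hedged reconstruction of the implied two-level logistic model (not necessarily the authors' exact specification) is

\[
\operatorname{logit}\, p_{ij} = \beta_0 + \beta_1\,\mathrm{PoS}_j + \boldsymbol{\gamma}^{\top}\mathbf{x}_{ij} + u_j,
\qquad u_j \sim \mathcal{N}(0, \tau^2),
\]

where \(i\) indexes individuals and \(j\) neighborhoods. The reported OR corresponds to \(e^{\beta_1} = 0.84\): each one-unit improvement in neighborhood perception of safety multiplies the odds of mobility limitation by 0.84, i.e., reduces them by 16%, holding the other covariates fixed.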

Relevance: 30.00%

Abstract:

Major food adulteration and contamination events occur with alarming regularity and are known to be episodic; the question is not if but when another large-scale food safety/integrity incident will occur. Indeed, the challenges of maintaining food security are now internationally recognised. The ever-increasing scale and complexity of food supply networks can leave them significantly more vulnerable to fraud and contamination, and potentially dysfunctional. This can make the task of deciding which analytical methods are more suitable for collecting and analysing (bio)chemical data within complex food supply chains, at targeted points of vulnerability, that much more challenging. It is evident that those working within and associated with the food industry are seeking rapid, user-friendly methods to detect food fraud and contamination, and rapid/high-throughput screening methods for the analysis of food in general. In addition to being robust and reproducible, these methods should be portable, ideally handheld and/or remote sensor devices, that can be taken to or positioned on/at-line at points of vulnerability along complex food supply networks, and should require a minimum amount of background training to acquire information-rich data rapidly (ergo, point-and-shoot). Here we briefly discuss a range of spectrometry- and spectroscopy-based approaches, many of which are commercially available, as well as other methods currently under development. We offer a future perspective on how this growing portfolio of detection methods, along with developments in computational and information sciences such as predictive computing and the Internet of Things, will together form systems- and technology-based approaches that significantly reduce the areas of vulnerability to food crime within food supply chains. Food fraud is a problem of systems, and it therefore requires systems-level solutions and thinking.

Relevance: 30.00%

Abstract:

In the past decade, several major food safety crises originated from problems with feed. Consequently, there is an urgent need for early detection of fraudulent adulteration and contamination in the feed chain. Strategies are presented for two specific cases, viz. adulteration of (i) soybean meal with melamine and other types of adulterants/contaminants and (ii) vegetable oils with mineral oil, transformer oil, or other oils. These strategies comprise screening at the feed mill or port of entry with non-destructive spectroscopic methods (NIRS and Raman), followed by post-screening and confirmation in the laboratory with MS-based methods. The spectroscopic techniques are suitable for on-site and on-line applications. Currently, they are suited to detecting fraudulent adulteration at relatively high levels, but not low-level contamination. The potential use of the strategies for non-targeted analysis is demonstrated.
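
As one concrete form a non-targeted screening step could take (a sketch on synthetic data, not the paper's validated workflow): model spectra of authentic material with PCA and flag samples whose reconstruction error is anomalous, sending them on for MS-based confirmation.

```python
import numpy as np
from sklearn.decomposition import PCA

# One-class PCA screening sketch. Data shapes, the component count, and the
# 99th-percentile threshold are illustrative choices, not from the paper.
rng = np.random.default_rng(1)
authentic = rng.normal(size=(200, 700))      # 200 reference NIR spectra (synthetic)
incoming = rng.normal(size=(5, 700)) + 0.5   # synthetic suspect spectra

pca = PCA(n_components=10).fit(authentic)

def q_residual(X):
    reconstructed = pca.inverse_transform(pca.transform(X))
    return ((X - reconstructed) ** 2).sum(axis=1)  # spectral reconstruction error

threshold = np.percentile(q_residual(authentic), 99)
flag_for_confirmation = q_residual(incoming) > threshold
print(flag_for_confirmation)                 # True -> confirm with MS-based method
```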

Relevance: 30.00%

Abstract:

Internal combustion (IC) engines exploit only about 30% of the fuel's chemical energy released through combustion, while the remaining part is rejected through the cooling system and the exhaust gas. Nowadays, a major global concern is finding sustainable solutions for better fuel economy, which in turn results in a decrease of carbon dioxide (CO2) emissions. Waste Heat Recovery (WHR) is one of the most promising techniques for increasing the overall efficiency of a vehicle system, allowing the recovery of the heat rejected by the exhaust and cooling systems. In this context, Organic Rankine Cycles (ORCs) are widely recognized as a potential technology for exploiting the heat rejected by engines to produce electricity. The aim of the present paper is to investigate a WHR system, designed to collect both coolant and exhaust gas heat, coupled with an ORC cycle for vehicle applications. In particular, a coolant heat exchanger (CLT) allows heat exchange between the water coolant and the ORC working fluid, whereas the exhaust gas heat is recovered using a secondary circuit with diathermic oil. Using an in-house numerical model, a wide range of working conditions and ORC design parameters is investigated; in particular, the analyses focus on the location of the regenerator inside the ORC circuit. Five organic fluids, working in both subcritical and supercritical conditions, have been selected in order to identify the most suitable configuration in terms of energy and exergy efficiencies.
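
For reference, the energy and exergy efficiencies being compared are, in their standard textbook definitions (the paper's in-house model is not reproduced here):

\[
\eta_{I} = \frac{\dot W_{\mathrm{net}}}{\dot Q_{\mathrm{CLT}} + \dot Q_{\mathrm{exh}}},
\qquad
\eta_{II} = \frac{\dot W_{\mathrm{net}}}{\dot{E}x_{\mathrm{CLT}} + \dot{E}x_{\mathrm{exh}}},
\]

where, for a sensible heat source cooled from \(T_{\mathrm{in}}\) to \(T_{\mathrm{out}}\) at ambient temperature \(T_0\),

\[
\dot{E}x = \dot m\, c_p\left[(T_{\mathrm{in}} - T_{\mathrm{out}}) - T_0 \ln\frac{T_{\mathrm{in}}}{T_{\mathrm{out}}}\right].
\]

The exergy (second-law) efficiency is the more discriminating metric for WHR, since it charges the cycle for the quality, not just the quantity, of the recovered heat.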