948 results for deterministic safety analysis
Abstract:
The first part of this thesis deals with the interaction between a flood-detention basin and the underlying aquifer: the construction of a flood-control reservoir on the Baganza stream, upstream of the city of Parma, is currently at the design stage. The aim of the intervention is to reduce flood risk by temporarily storing, in an artificial reservoir, the most dangerous portion of the flood volume, which would subsequently be released at discharges that can be safely conveyed through the urban reach of the stream. The aquifer was preliminarily investigated and monitored, allowing its lithostratigraphic characterization. The stratigraphy can be summarized as a sequence of gravelly-sandy layers interbedded with clay lenses of varying thickness and continuity, distinguishing two different aquifers (one phreatic and one confined). The present study refers only to the shallow aquifer, which was modelled numerically, by finite differences, with the software MODFLOW_2005. The aim of this work is to represent the aquifer system both under current conditions (in the absence of any structure) and under design conditions. Calibration was carried out under steady-state conditions using the piezometric levels collected at the observation points during spring 2013. The hydraulic conductivity values were estimated by means of a Bayesian geostatistical approach. The code used for the estimation is bgaPEST, a free software package for the solution of highly parameterized inverse problems, developed following the protocols of the PEST software. The inverse methodology estimates the hydraulic conductivity field by combining observations of the system state (piezometric levels in the present case) with a-priori information on the structure of the unknown parameters.
The inverse procedure requires the computation of the sensitivity of each observation to each estimated parameter; this was evaluated efficiently by means of an adjoint-state formulation of the forward code, MODFLOW_2005_Adjoint. The results of the methodology are consistent with the alluvial nature of the investigated aquifer and with the information collected at the observation points. The calibrated model can therefore be used to support the design and management of the flood-control structure. The second part of this thesis deals with the analysis of the loads induced by preferential flow paths caused by piping phenomena within levee embankments. Such preferential paths can be due to the presence of burrows dug by wild animals. This study was inspired by the collapse of the levee of the Secchia River (Modena), which occurred in January 2014 following a flood event during which the water level never reached the levee crest. The scientific commission, whose final report provides the data used for this study, attributed the levee failure, with high probability, to the presence of animal burrows. In order to analyse the behaviour of the embankment both in intact conditions and in conditions modified by the existence of a tunnel crossing the levee body, a 3D numerical model of the levee was built using the well-known software packages Femwater and Feflow. The models describe seepage within the embankment using the finite-element technique, considering the soil in both its saturated and unsaturated portions. The burrow was represented by elements with high permeability and porosity, whose values were varied in order to evaluate their influence on flows and water contents.
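The Bayesian geostatistical update used in the first part of the thesis can be illustrated in heavily simplified form. The sketch below is a pure-Python, single-observation, two-parameter version of the linear posterior-mean update in the spirit of bgaPEST's quasi-linear approach; all numbers, the sensitivity row H, and the covariances are invented for illustration, not taken from the thesis.

```python
# Minimal sketch of a linear Bayesian geostatistical update (illustrative
# numbers only). Posterior mean:
#   s_hat = s_prior + Q H^T (H Q H^T + R)^-1 (y - H s_prior)

def bga_update_single_obs(s_prior, Q, H, y, r):
    """One scalar head observation y = H . s + noise updating two
    hydraulic-conductivity parameters s (pure-Python 2-parameter case)."""
    # Model-predicted head at the prior mean
    y_prior = H[0] * s_prior[0] + H[1] * s_prior[1]
    # H Q H^T + R (a scalar, because there is a single observation)
    hqht = sum(H[i] * Q[i][j] * H[j] for i in range(2) for j in range(2))
    denom = hqht + r
    # Q H^T: cross-covariance between the parameters and the observation
    qht = [Q[i][0] * H[0] + Q[i][1] * H[1] for i in range(2)]
    gain = [c / denom for c in qht]
    innovation = y - y_prior
    return [s_prior[i] + gain[i] * innovation for i in range(2)]

s_hat = bga_update_single_obs(
    s_prior=[0.0, 0.0],            # prior mean of the (log-)conductivities
    Q=[[1.0, 0.5], [0.5, 1.0]],    # prior covariance (spatial correlation)
    H=[1.0, 1.0],                  # sensitivity row (what an adjoint run gives)
    y=2.0,                         # observed head residual
    r=0.1,                         # measurement-error variance
)
print(s_hat)  # both parameters are pulled toward the observation
```

Both parameters receive the same correction here because they are equally correlated with, and equally sensitive to, the single observation.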
To assess whether the analysed scenarios involve the onset of erosion, factor-of-safety values were computed. The factor of safety was evaluated in several different ways, including the approach recently proposed by Richards and Reddy (2014), which refers to the critical kinetic energy criterion. Finally, the model of Bonelli (2007) was used to compute the erosion time and the time remaining before collapse of the embankment.
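As a much simpler illustration of a factor-of-safety check against piping (not the Richards and Reddy kinetic-energy criterion or the Bonelli erosion model used in the thesis), the classical critical-hydraulic-gradient criterion can be sketched as follows; the soil parameters and head values are invented:

```python
# Illustrative piping/heave check: factor of safety as the ratio of the
# critical hydraulic gradient to the actual exit gradient (toy numbers).

def critical_gradient(gs, e):
    """Terzaghi critical hydraulic gradient, i_c = (Gs - 1) / (1 + e)."""
    return (gs - 1.0) / (1.0 + e)

def safety_factor(head_loss, seepage_length, gs=2.65, e=0.6):
    i_exit = head_loss / seepage_length       # average exit gradient
    return critical_gradient(gs, e) / i_exit  # FS < 1 suggests piping onset

fs = safety_factor(head_loss=2.0, seepage_length=4.0)
print(round(fs, 3))  # gradient well below critical in this example
```

A concentrated leak through a burrow shortens the effective seepage length, which raises the exit gradient and drives this factor of safety down.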
Abstract:
For more than forty years, research has been ongoing in the use of the computer in the processing of natural language. During this period methods have evolved, with various parsing techniques and grammars coming to prominence. Problems still exist, not least in the field of Machine Translation. However, one of the successes in this field is the translation of sublanguage. The present work reports Deterministic Parsing, a relatively new parsing technique, and its application to the sublanguage of an aircraft maintenance manual for Machine Translation. The aim has been to investigate the practicability of using Deterministic Parsers in the analysis stage of a Machine Translation system. Machine Translation, sublanguage and parsing are described in general terms, with a review of the Deterministic Parsing systems pertinent to this research presented in detail. The interaction between Machine Translation, sublanguage and parsing, including Deterministic Parsing, is also highlighted. Two types of Deterministic Parser have been investigated: a Marcus-type parser, based on the basic design of the original Deterministic Parser (Marcus, 1980), and an LR-type Deterministic Parser for natural language, based on the LR parsing algorithm. In total, four Deterministic Parsers have been built and are described in the thesis. Two of the Deterministic Parsers are prototypes from which the remaining two parsers, to be used on sublanguage, have been developed. This thesis reports the results of parsing by the prototypes, a Marcus-type parser and an LR-type parser, which have a similar grammatical and linguistic range to the original Marcus parser. The Marcus-type parser uses a grammar of production rules, whereas the LR-type parser employs a Definite Clause Grammar (DCG).
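The core idea of deterministic parsing, that the parser commits to each structural decision without backtracking, can be sketched with a toy shift-reduce loop. The grammar and lexicon below are invented for illustration and are far simpler than either the Marcus-style or LR-style parsers of the thesis:

```python
# Toy deterministic shift-reduce parser: shift a category for each word, then
# greedily reduce whenever the top of the stack matches a rule's right-hand
# side. No backtracking: every reduction is final.

LEXICON = {"the": "Det", "a": "Det", "dog": "N", "cat": "N", "saw": "V"}
RULES = {("Det", "N"): "NP", ("V", "NP"): "VP", ("NP", "VP"): "S"}

def parse(sentence):
    stack = []
    for word in sentence.split():
        stack.append(LEXICON[word])           # shift
        reduced = True
        while reduced:                         # reduce as long as possible
            reduced = False
            for rhs, lhs in RULES.items():
                n = len(rhs)
                if len(stack) >= n and tuple(stack[-n:]) == rhs:
                    del stack[-n:]
                    stack.append(lhs)
                    reduced = True
    return stack  # ['S'] means a complete, grammatical sentence

print(parse("the dog saw a cat"))  # ['S']
```

A Marcus-type parser adds a small lookahead buffer so these irrevocable decisions can still be made correctly; an LR-type parser precompiles the reduce decisions into a parse table.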
Abstract:
This thesis describes a study of the content and applicability of BS8800:1996 Guide to occupational health and safety management systems. The research is presented chronologically, with literature review and content analysis of SMS-related guides and standards interwoven with two elements of qualitative empirical work. The first of these was carried out shortly after publication of BS8800 in 1996, a 'before-the-event' investigation of how organisations were intending to approach SMS implementation. The challenges faced by these organisations are reviewed against standard management theory, suggesting that the initial motivation for SMS implementation governs the approach organisations will adopt to guidance such as BS8800. The second phase of empirical work was undertaken in the context of OHSAS 18001, an auditable protocol based on BS8800, which allows organisations to certify their safety management systems. A discussion of the evolution of certifiable safety management systems is presented, highlighting the similarities and differences between these, BS8800, SMS and wider management system standards. A case study then reviews the experiences of a catering company that implemented 18001, motivated by the opportunity for certification as a business benefit. The empirical work is used to comment on the guidance provided by BS8800, within its evolved role as guidance organisations may use for implementation of an SMS to be certified according to the specifications of OHSAS 18001. It is suggested that optimal implementation is facilitated by initial status review, continual improvement and the use of annexes, where these are used to make changes to the existing safety management system. This thesis concludes with a discussion of these elements, highlighting pertinent areas within BS8800 where revision or amendment may be appropriate.
Abstract:
Safety enforcement practitioners within Europe and marketers, designers or manufacturers of consumer products need to determine compliance with the legal test of "reasonable safety" for consumer goods, to reduce the "risks" of injury to the minimum. To enable freedom of movement of products, a method for safety appraisal is required for use as an "expert" system of hazard analysis by non-experts in safety testing of consumer goods for implementation consistently throughout Europe. Safety testing approaches and the concept of risk assessment and hazard analysis are reviewed in developing a model for appraising consumer product safety which seeks to integrate the human factors contribution of risk assessment, hazard perception, and information processing. The model develops a system of hazard identification, hazard analysis and risk assessment which can be applied to a wide range of consumer products through use of a series of systematic checklists and matrices and applies alternative numerical and graphical methods for calculating a final product safety risk assessment score. It is then applied in its pilot form by selected "volunteer" Trading Standards Departments to a sample of consumer products. A series of questionnaires is used to select participating Trading Standards Departments, to explore the contribution of potential subjective influences, to establish views regarding the usability and reliability of the model and any preferences for the risk assessment scoring system used. The outcome of the two stage hazard analysis and risk assessment process is considered to determine consistency in results of hazard analysis, final decisions regarding the safety of the sample product and to determine any correlation in the decisions made using the model and alternative scoring methods of risk assessment. The research also identifies a number of opportunities for future work, and indicates a number of areas where further work has already begun.
Abstract:
The state of the art in productivity measurement and analysis shows a gap between simple methods having little relevance in practice and sophisticated mathematical theory which is unwieldy for strategic and tactical planning purposes, particularly at company level. An extension is made in this thesis to the method of productivity measurement and analysis based on the concept of added value, appropriate to those companies in which the materials, bought-in parts and services change substantially and a number of plants and inter-related units are involved in providing components for final assembly. Reviews and comparisons of productivity measurement dealing with alternative indices and their problems have been made, and appropriate solutions put forward for productivity analysis in general and the added value method in particular. Based on this concept and method, three kinds of computerised models have been developed to cope with the planning of productivity and productivity growth with reference to the changes in their component variables, ranging from a single value to a class interval of values of a productivity distribution: two of them deterministic, called sensitivity analysis and deterministic appraisal, and the third stochastic, called risk simulation. The models are designed to be flexible and can be adjusted according to the available computer capacity, the expected accuracy and the presentation of the output. The stochastic model is based on the assumption of statistical independence between individual variables and the existence of normality in their probability distributions. The component variables have been forecast using polynomials of degree four. This model was tested by comparing its behaviour with that of a mathematical model using real historical data from British Leyland, and the results were satisfactory within acceptable levels of accuracy. Modifications to the model and its statistical treatment have been made as required.
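The "risk simulation" idea of treating the components of an added-value productivity index as independent normal variables can be sketched with a small Monte Carlo loop. All distributions and figures below are invented for illustration, not the thesis's British Leyland data:

```python
# Monte Carlo sketch of a stochastic productivity model: added value
# (sales minus bought-in materials, parts and services) over labour cost,
# with each component drawn as an independent normal variable.
import random
import statistics

def simulate_productivity(n=10_000, seed=42):
    rng = random.Random(seed)
    samples = []
    for _ in range(n):
        sales = rng.gauss(100.0, 5.0)       # sales revenue
        bought_in = rng.gauss(60.0, 4.0)    # materials, parts, services
        labour_cost = rng.gauss(20.0, 1.0)
        added_value = sales - bought_in
        samples.append(added_value / labour_cost)
    samples.sort()
    return {
        "mean": statistics.fmean(samples),
        "p05": samples[int(0.05 * n)],      # lower tail of the distribution
        "p95": samples[int(0.95 * n)],      # upper tail
    }

result = simulate_productivity()
print(result)  # distribution of the added-value productivity ratio
```

Planning with the whole distribution, rather than a single deterministic value, is what distinguishes the stochastic model from the two deterministic ones described above.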
The results of applying these measurements and planning models to the British motor vehicle manufacturing companies are presented and discussed.
Abstract:
The number of fatal accidents in the agricultural, horticultural and forestry industry in Great Britain has declined from an annual rate of about 135 in the 1960s to its current level of about 50. Changes to the size and makeup of the population at risk mean that there has been no real improvement in fatal injury incidence rates for farmers. The Health and Safety Executive's (HSE) current system of accident investigation, recording and analysis is directed primarily at identifying fault, allocating blame and punishing wrongdoers. Relatively little information is recorded about the personal and organisational factors that contributed to, or failed to prevent, accidents. To develop effective preventive strategies, it is important to establish whether errors by the victims and others occur at the skills, rules or knowledge level of functioning; are violations of some rule or procedure; or stem from failures to correctly appraise or control a hazard. A modified version of the Hale and Glendon accident causation model was used to study 230 fatal accidents. Inspectors' original reports were examined and expert judgement applied to identify and categorise the errors committed by each of the parties involved. The highest proportion of errors that led directly to accidents occurred whilst the victims were operating at the knowledge level. The mix and proportion of errors varied considerably between different classes of victim and kinds of accident. Different preventive strategies will be needed to address the problem areas identified.
Abstract:
There has been little research in health and safety management concerning the application of information technology to the field. This thesis attempts to stimulate interest in this area by analysing the value of proprietary health and safety software to proactive health and safety management. The thesis is based upon the detailed software evaluation of seven pieces of proprietary health and safety software. It features a discussion concerning the development of information technology and health and safety management, a review of the key issues identified during the software evaluations, an analysis of the commercial market for this type of software, and a consideration of the broader issues which surround the use of this software. It also includes practical guidance for the evaluation, selection, implementation and maintenance of all health and safety management software, including a comprehensive software evaluation chart. The implications of the research are considered for proprietary health and safety software, for the application of information technology to health and safety management, and for future research.
Abstract:
This thesis is concerned with certain aspects of the Public Inquiry into the accident at Houghton Main Colliery in June 1975. It examines whether, prior to the accident, there existed at the Colliery a situation in which too much reliance was being placed upon state regulation and too little upon personal responsibility. I study the phenomenon of state regulation. This is done (a) by analysis of selected writings on state regulation/intervention/interference/bureaucracy (the words are used synonymously) over the last two hundred years, specifically those of Marx on the 1866 Committee on Mines, and (b) by studying Chadwick and Tremenheere, leading and contrasting "bureaucrats" of the mid-nineteenth century. The bureaucratisation of the mining industry over the period 1835-1954 is described, and it is demonstrated that the industry obtained and now possesses those characteristics outlined by Max Weber in his model of bureaucracy. I analyse criticisms of the model and find them to be relevant, in that they facilitate understanding both of the circumstances of the accident and of the Inquiry. Further understanding of the circumstances and causes of the accident was gained by attendance at the Inquiry and by interviewing many of those involved in it. I analyse many aspects of the Inquiry - its objectives, structure, procedure and conflicting interests - and find that, although the Inquiry had many of the symbols of bureaucracy, it suffered not from "too much" outside interference, but rather from the coal mining industry's shared belief in its ability to solve its own problems. I found nothing to suggest that, prior to the accident, colliery personnel relied, or were encouraged to rely, "too much" upon state regulation.
Abstract:
In a general introduction to the road-accident phenomenon inside and outside Iran, the results of previous research and of international conferences and seminars on road safety are reviewed. A sample road between Tehran and Mashad is also investigated as a case study. In examining road-accident data and information, first, the information presented in road-accident report forms in developed countries is discussed and, second, the procedures for road-accident data collection in Iran are investigated in detail. The data supplied by the Iran Road-Police Central Statistics Office are analysed, different rates are computed, comparisons with other nations are made, and the results are discussed. Such analyses and comparisons are also presented for the different provinces of Iran. It is concluded that each province, with its own natural, geographical, social and economic characteristics, possesses its own reasons for the quality and quantity of road accidents and therefore must receive its own appropriate remedial solutions. The questions of "what is the cost of road accidents", "why and how evaluate the cost", and "what is the appropriate approach to such evaluation" are all discussed, and then the cost of road accidents in Iran is computed on two different bases: "gross national output" and "court award". It is concluded that this cost is about 1.5 per cent of the country's national product. In Appendix 3 an impressive example is given of the trend of costs and benefits that can be attributed to investment in road-safety measures.
Abstract:
The Report of the Robens Committee (1972), the Health and Safety at Work Act (1974) and the Safety Representatives and Safety Committees Regulations (1977) provide the framework within which this study of certain aspects of health and safety is carried out. The philosophy of self-regulation is considered and its development is set within an historical and an industrial relations perspective. The research uses a case study approach to examine the effectiveness of self-regulation in health and safety in a public sector organisation. Within this approach, methodological triangulation employs the techniques of interviews, questionnaires, observation and documentary analysis. The work is based in four departments of a Scottish Local Authority, and particular attention is given to three of the main 'agents' of self-regulation - safety representatives, supervisors and safety committees - and to their interactions, strategies and effectiveness. A behavioural approach is taken in considering the attitudes, values, motives and interactions of safety representatives and management. Major internal and external factors, which interact and which influence the effectiveness of joint self-regulation of health and safety, are identified. It is emphasised that an organisation cannot be studied without consideration of the context within which it operates, both locally and in the wider environment. One of these factors, organisational structure, is described as bureaucratic, and the model of a Representative Bureaucracy described by Gouldner (1954) is compared with findings from the present study. An attempt is made to ascertain how closely the Local Authority fits Gouldner's model. This research contributes both to knowledge and to theory in the subject area by providing an in-depth study of self-regulation in a public sector organisation, which, when compared with such studies as those of Beaumont (1980, 1981, 1982), highlights some of the differences between the public and private sectors.
Both empirical data and hypothetical models are used to provide description and explanation of the operation of the health and safety system in the Local Authority. As data were collected during a dynamic period in economic, political and social terms, the research discusses some of the effects of the current economic recession upon safety organisation.
Abstract:
OBJECTIVE: Recent critiques of incident reporting suggest that its role in managing safety has been over-emphasized. The objective of this study was to examine the perceived effectiveness of incident reporting in improving safety in mental health and acute hospital settings by asking staff about their perceptions and experiences. DESIGN: Qualitative research design using documentary analysis and semi-structured interviews. SETTING: Two large teaching hospitals in London; one providing acute and the other mental healthcare. PARTICIPANTS: Sixty-two healthcare practitioners with experience of reporting and analysing incidents. RESULTS: Incident reporting was perceived as having a positive effect on safety, not only by leading to changes in care processes but also by changing staff attitudes and knowledge. Staff discussed examples of both instrumental and conceptual uses of the knowledge generated by incident reports. There are difficulties in using incident reports to improve safety in healthcare at all stages of the incident reporting process. Differences in the risks encountered and in the organizational systems developed in the two hospitals to review reported incidents could be linked to the differences we found in attitudes to incident reporting between the two hospitals. CONCLUSION: Incident reporting can be a powerful tool for developing and maintaining an awareness of risks in healthcare practice. Using incident reports to improve care is challenging, and the study highlighted the complexities involved and the difficulties faced by staff in learning from incident data.
Abstract:
This study presents quantitative evidence from a number of simulation experiments on the accuracy of the productivity-growth estimates derived from growth accounting (GA) and frontier-based methods (namely data envelopment analysis-, corrected ordinary least squares-, and stochastic frontier analysis-based Malmquist indices) under various conditions. These include the presence of technical inefficiency, measurement error, misspecification of the production function (for the GA and parametric approaches) and increased input and price volatility from one period to the next. The study finds that the frontier-based methods usually outperform GA, but the overall performance varies by experiment. Parametric approaches generally perform best when there is no functional-form misspecification, but their accuracy diminishes greatly otherwise. The results also show that the deterministic approaches perform adequately even under conditions of (modest) measurement error; when measurement error becomes larger, the accuracy of all approaches (including the stochastic approaches) deteriorates rapidly, to the point that their estimates could be considered unreliable for policy purposes.
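The growth-accounting (GA) benchmark in such comparisons treats total factor productivity growth as the output growth left over after subtracting share-weighted input growth. A minimal sketch, with invented figures:

```python
# Growth-accounting sketch: d ln TFP = d ln Y - s_K d ln K - s_L d ln L,
# with factor shares summing to one (illustrative numbers only).
import math

def tfp_growth(y0, y1, k0, k1, l0, l1, s_k):
    """Log-difference TFP growth between two periods, capital share s_k."""
    s_l = 1.0 - s_k
    return (math.log(y1 / y0)
            - s_k * math.log(k1 / k0)
            - s_l * math.log(l1 / l0))

# Output grows 5%, capital 3% (share 0.3), labour 1% (share 0.7)
g = tfp_growth(y0=100, y1=105, k0=50, k1=51.5, l0=20, l1=20.2, s_k=0.3)
print(f"{g:.4f}")  # the residual attributed to productivity growth
```

Because GA attributes the entire residual to productivity, any technical inefficiency or measurement error in the inputs lands directly in this estimate, which is what the frontier-based alternatives in the study try to disentangle.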
Abstract:
This paper explains some drawbacks of previous approaches for detecting influential observations in deterministic nonparametric data envelopment analysis models, as developed by Yang et al. (Annals of Operations Research 173:89-103, 2010). For example, the efficiency scores and relative entropies obtained in that model are uninformative for outlier detection, and the empirical distribution of all estimated relative entropies is not a Monte Carlo approximation. In this paper we develop a new method to detect whether a specific DMU is truly influential, and a statistical test is applied to determine the significance level. An application measuring the efficiency of hospitals is used to show the superiority of this method, which leads to significant advancements in outlier detection. © 2014 Springer Science+Business Media New York.
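The basic notion of an influential DMU can be illustrated with a leave-one-out check on a single-input, single-output ratio frontier. This is far simpler than a full DEA model and is not the entropy-based statistic or significance test developed in the paper; the hospital data are invented:

```python
# Leave-one-out influence sketch: a DMU is influential if removing it shifts
# the efficiency scores of the remaining DMUs (single input, single output).

def efficiencies(dmus):
    """Ratio efficiency of each (input, output) pair against the best ratio."""
    best = max(y / x for x, y in dmus)
    return [(y / x) / best for x, y in dmus]

def influence(dmus, j):
    """Mean absolute change in the other DMUs' scores when DMU j is removed."""
    base = efficiencies(dmus)
    rest = [d for i, d in enumerate(dmus) if i != j]
    loo = efficiencies(rest)
    others = [s for i, s in enumerate(base) if i != j]
    return sum(abs(a - b) for a, b in zip(others, loo)) / len(loo)

# Hypothetical hospitals as (input, output); the last one defines the frontier
hospitals = [(10, 5), (8, 4), (12, 5), (6, 6)]
scores = [influence(hospitals, j) for j in range(len(hospitals))]
print(scores)  # only the frontier-defining DMU has a nonzero influence here
```

Removing a non-frontier DMU leaves every other score untouched; removing the frontier-defining one rescales them all, which is the behaviour an influence statistic is meant to capture.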
Abstract:
Dedicated short-range communications (DSRC) is a promising vehicle communication technique for collaborative road safety applications (CSA). However, road safety applications require highly reliable and timely wireless communications, and the random channel access method used in DSRC presents big challenges for effective and robust quality-of-service (QoS) provisioning in DSRC-based vehicle networks. In this paper we examine the QoS control problem for CSA in DSRC-based vehicle networks and present an overview of the research work on this problem. After an analysis of the system application requirements and the features of DSRC vehicle networks, we propose a framework for cooperative and adaptive QoS control, which we believe is key to the success of DSRC in supporting effective collaborative road safety applications. A core design element of the proposed QoS control framework is the use of network feedback and cross-layer design to collaboratively achieve the targeted QoS. A design example of a cooperative and adaptive rate control scheme is implemented and evaluated, with the objective of illustrating the key ideas in the framework. Simulation results demonstrate the effectiveness of the proposed rate control scheme in providing a highly available and reliable channel for emergency safety messages. © 2013 Wenyang Guan et al.
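A feedback-driven rate control loop of the general kind the framework motivates can be sketched as follows. This is an invented AIMD-style toy, not the scheme evaluated in the paper; the channel-busy-ratio (CBR) model, parameters and vehicle count are all assumptions:

```python
# Sketch of adaptive beacon-rate control from channel feedback: additive
# increase while the channel is under-used, multiplicative decrease when the
# measured channel busy ratio (CBR) exceeds the target.

def adapt_rate(rate, cbr, target_cbr=0.6, step=1.0, beta=0.75,
               min_rate=1.0, max_rate=10.0):
    if cbr > target_cbr:
        rate *= beta        # back off when the channel is congested
    else:
        rate += step        # probe for more capacity otherwise
    return max(min_rate, min(max_rate, rate))

# Toy closed loop: CBR grows with the aggregate rate of 50 identical vehicles
rate, n_vehicles, capacity = 10.0, 50, 500.0
for _ in range(40):
    cbr = min(1.0, n_vehicles * rate / capacity)
    rate = adapt_rate(rate, cbr)
print(round(rate, 2))  # oscillates around the rate that keeps CBR near target
```

Because every vehicle reacts to the same shared channel measurement, the control is cooperative: all nodes converge to a fair share that holds the channel load near the target, leaving headroom for emergency safety messages.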
Abstract:
Objective: The purpose of this study was to determine the extent to which mobility indices (such as walking speed and postural sway), motor initiation, and cognitive function, specifically executive functions including spatial planning, visual attention, and within-participant variability, differentially predicted collisions on the near and far sides of the road with increasing age. Methods: Adults aged over 45 years participated in cognitive tests measuring executive function and visual attention (using the Useful Field of View; UFoV®) and mobility assessments (walking speed, sit-to-stand, self-reported mobility, and postural sway assessed using motion-capture cameras), and gave road-crossing choices in a two-way filmed real-traffic pedestrian simulation. Results: A stepwise regression model (walking speed, start-up delay variability, and processing speed) explained 49.4% of the variance in near-side crossing errors. Walking speed, start-up delay measures (average and variability), and spatial planning explained 54.8% of the variance in far-side unsafe crossing errors. Start-up delay was predicted by walking speed only (explaining 30.5% of the variance). Conclusion: Walking speed and start-up delay measures were consistent predictors of unsafe crossing behaviours. Cognitive measures, however, differentially predicted near-side errors (processing speed) and far-side errors (spatial planning). These findings offer potential contributions for identifying and rehabilitating at-risk older pedestrians.
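The "variance explained" figures reported above come from regression R² values. A minimal single-predictor sketch of that computation (the data below are invented, not the study's pedestrian measurements):

```python
# Ordinary least squares with one predictor and its R^2, the proportion of
# variance in y explained by the fitted line (toy data).

def ols_r2(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxx = sum((a - mx) ** 2 for a in x)
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((b - (intercept + slope * a)) ** 2 for a, b in zip(x, y))
    ss_tot = sum((b - my) ** 2 for b in y)
    return 1.0 - ss_res / ss_tot  # R^2: 1.0 means a perfect linear fit

# Hypothetical walking speeds (m/s) vs. counts of unsafe crossing choices
speed = [0.8, 1.0, 1.2, 1.4, 1.6]
errors = [6.0, 5.0, 3.5, 3.0, 1.5]
print(round(ols_r2(speed, errors), 3))  # close to 1 for this toy data
```

A stepwise model such as the study's simply adds predictors one at a time, keeping each only if it raises the explained variance by a significant amount.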