28 results for deterministic safety analysis
in CentAUR: Central Archive at the University of Reading - UK
Abstract:
Techniques for the coherent generation and detection of electromagnetic radiation in the far infrared, or terahertz, region of the electromagnetic spectrum have recently developed rapidly and may soon be applied for in vivo medical imaging. Both continuous wave and pulsed imaging systems are under development, with terahertz pulsed imaging being the more common method. Typically a pump and probe technique is used, with picosecond pulses of terahertz radiation generated from femtosecond infrared laser pulses, using an antenna or nonlinear crystal. After interaction with the subject either by transmission or reflection, coherent detection is achieved when the terahertz beam is combined with the probe laser beam. Raster scanning of the subject leads to an image data set comprising a time series representing the pulse at each pixel. A set of parametric images may be calculated, mapping the values of various parameters calculated from the shape of the pulses. A safety analysis has been performed, based on current guidelines for skin exposure to radiation of wavelengths 2.6 µm–20 mm (15 GHz–115 THz), to determine the maximum permissible exposure (MPE) for such a terahertz imaging system. The international guidelines for this range of wavelengths are drawn from two U.S. standards documents. The method for this analysis was taken from the American National Standard for the Safe Use of Lasers (ANSI Z136.1), and to ensure a conservative analysis, parameters were drawn from both this standard and from the IEEE Standard for Safety Levels with Respect to Human Exposure to Radio Frequency Electromagnetic Fields (C95.1). The calculated maximum permissible average beam power was 3 mW, indicating that typical terahertz imaging systems are safe according to the current guidelines. Further developments may however result in systems that will exceed the calculated limit. Furthermore, the published MPEs for pulsed exposures are based on measurements at shorter wavelengths and with pulses of longer duration than those used in terahertz pulsed imaging systems, so the results should be treated with caution.
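For illustration, the arithmetic behind such a power limit is simple: an exposure limit expressed as an irradiance (power per unit area), multiplied by the beam cross-section, gives a maximum permissible average beam power. The sketch below is a minimal illustration of this step only, assuming an irradiance limit of 10 mW/cm² (the order of the IEEE C95.1 figure for this frequency range) and a hypothetical 6 mm beam diameter; neither value is taken from the paper, whose 3 mW result comes from a full analysis under ANSI Z136.1.

```python
import math

# Hedged sketch of the basic MPE arithmetic described above.
# The irradiance limit (10 mW/cm^2) and the beam diameter (6 mm)
# are illustrative assumptions, not values from the paper.
mpe_irradiance_mw_per_cm2 = 10.0   # assumed exposure limit
beam_diameter_cm = 0.6             # assumed beam diameter

beam_area_cm2 = math.pi * (beam_diameter_cm / 2.0) ** 2
max_average_power_mw = mpe_irradiance_mw_per_cm2 * beam_area_cm2

print(f"Maximum permissible average beam power ~ {max_average_power_mw:.1f} mW")
# ~2.8 mW, the same order of magnitude as the 3 mW limit reported above.
```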
Abstract:
This is the first half of a two-part paper which deals with the social theoretic assumptions underlying system dynamics. The motivation is that clarification in this area can help mainstream social scientists to understand how our field relates to their literature, methods and concerns. Part I has two main sections. The aim of the first is to answer the question: How do the ideas of system dynamics relate to traditional social theories? The theoretic assumptions of the field are seldom explicit but rather are implicit in its practice. The range of system dynamics practice is therefore considered and related to a framework - widely used in both operational research (OR) and systems science - that organises the assumptions behind traditional social theoretic paradigms. Distinct and surprisingly varied groupings of practice are identified, making it difficult to place system dynamics in any one paradigm with any certainty. The difficulties of establishing a social theoretic home for system dynamics are exemplified in the second main section. This is done by considering the question: Is system dynamics deterministic? An analysis shows that attempts to relate system dynamics to strict notions of voluntarism or determinism quickly indicate that the field does not fit with either pole of this dichotomous, and strictly paradigmatic, view. Part I therefore concludes that definitively placing system dynamics with respect to traditional social theories is highly problematic. The scene is therefore set for Part II of the paper, which proposes an innovative and potentially fruitful resolution to this problem.
Abstract:
Iatrogenic errors and patient safety in clinical processes are an increasing concern. The quality of process information in hardcopy or electronic form can heavily influence clinical behaviour and decision-making errors. Little work has been undertaken to assess the safety impact of the clinical process planning documents that guide clinical actions and decisions. This paper investigates the clinical process documents used in elective surgery and their impact on latent and active clinical errors. Eight clinicians from a large health trust underwent extensive semi-structured interviews to understand their use of clinical documents and their perceived impact on errors and patient safety. Samples of the key types of document used were analysed. Theories of latent organisational and active errors from the literature were combined with the EDA semiotics model of behaviour and decision making to propose the EDA Error Model. This model enabled us to identify perceptual, evaluation, knowledge and action error types, and approaches to reducing their causes. The EDA Error Model was then used to analyse sample documents and identify error sources and controls. The types of knowledge artefact structure used in the documents were identified and assessed in terms of safety impact. This approach was combined with analysis of the questionnaire findings using existing error knowledge from the literature. The results identified a number of document and knowledge artefact issues that give rise to latent and active errors, as well as issues concerning medical culture and teamwork, together with recommendations for further work.
Abstract:
Medication safety and errors are a major concern in care homes. In addition to the identification of incidents, there is a need for a comprehensive system description to avoid the danger of introducing interventions that have unintended consequences and are therefore unsustainable. The aim of the study was to explore the impact and uniqueness of Work Domain Analysis (WDA) to facilitate an in-depth understanding of medication safety problems within the care home system and identify the potential benefits of WDA to design safety interventions to improve medication safety. A comprehensive, systematic and contextual overview of the care home medication system was developed for the first time. The novel use of the Abstraction Hierarchy (AH) to analyse medication errors revealed the value of the AH to guide a comprehensive analysis of errors and generate system improvement recommendations that took into account the contextual information of the wider system.
Abstract:
The globalization of trade in fish has created many challenges for the developing world, specifically with regard to food safety and quality. International organisations have established a good basis for standards in international trade. While these requirements are frequently embraced by the major importers (such as Japan, the EU and the USA), they often impose additional safety requirements and regularly identify batches which fail to meet their strict standards. Creating an effective national seafood control system which meets both internal national needs and the requirements of the export market can be challenging. Many countries adopt a dual system in which seafood products for the major export markets are subject to tight control while the majority of products (whether for the local market or for more regional trade) are less tightly controlled. With regional liberalization also occurring, deciding on appropriate controls is complex. In the Sultanate of Oman, fisheries production is one of the country's chief sources of economic revenue after oil production and a major source of the national food supply. In this paper the structure of the fish supply chain is analysed, highlighting the different routes operating for the different markets. Although much of the fish is consumed within Oman, there is a major export trade to local regional markets. Much smaller quantities meet the more stringent standards imposed by the major importing countries, and exports to these are limited. The paper considers the development of the Omani fish control system, including the key legislative documents and the administrative structures that have been developed. Establishing modern controls which satisfy the demands of the major importers is possible but places additional costs on businesses. Enhanced controls such as HACCP and other management standards are required but can be difficult to justify when alternative markets do not specify them. These enhanced controls do, however, provide additional consumer protection and can bring benefits to local consumers. The Omani government is attempting to upgrade the system of controls and has made considerable progress towards the implementation of HACCP and the introduction of enhanced management systems into its industrial sector. Strengthened legislation and government support, including subsidies, have encouraged some businesses to implement HACCP. The current control systems are reviewed and a SWOT analysis used to identify key factors for their future development. The study shows that seafood products in the supply chain are often exposed to lengthy handling and distribution processes before reaching consumers, a typical issue faced by many developing countries. As seafood products are perishable, their safety is compromised if not adequately controlled. The enforcement of current food safety laws in the Sultanate of Oman is shared across various government agencies. Consequently, there is a need to harmonize all regulatory requirements, enhance domestic food protection, and continue to work towards a fully risk-based approach in order to compete successfully in the global market.
Abstract:
Bloom-forming and toxin-producing cyanobacteria remain a persistent nuisance across the world. Modelling of cyanobacteria in freshwaters is an important tool for understanding their population dynamics and predicting the location and timing of bloom events in lakes and rivers. A new deterministic mathematical model was developed which simulates the growth and movement of cyanobacterial blooms in river systems. The model focuses on the mathematical description of bloom formation, vertical migration and lateral transport of colonies within river environments, taking into account the major factors that affect cyanobacterial bloom formation in rivers, including light, nutrients and temperature. A technique called generalised sensitivity analysis was applied to identify the critical parameter uncertainties in the model and to investigate the interactions between the chosen parameters. The analysis suggested that 8 of the 12 parameters were significant in reproducing the observed cyanobacterial behaviour in simulation, and revealed a high degree of correlation between the half-saturation rate constants used in the model.
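Generalised sensitivity analysis (in the Hornberger-Spear-Young sense) samples parameter sets from prior ranges, classifies each simulation as "behavioural" or "non-behavioural" against the observations, and flags a parameter as significant when its distribution differs between the two classes. The sketch below illustrates the procedure on a toy stand-in model; the parameter names, ranges and classification criterion are assumptions for illustration, not the twelve parameters of the bloom model.

```python
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)

# Stand-in for the bloom model: any function mapping a parameter
# vector to a scalar summary of simulated behaviour. Names and
# ranges here are illustrative assumptions.
def run_model(params):
    growth, half_sat, mortality = params
    return growth / (half_sat + 0.5) - mortality  # toy peak-biomass proxy

n_samples = 5000
names = ["growth", "half_sat", "mortality"]
lows, highs = np.array([0.1, 0.01, 0.0]), np.array([2.0, 1.0, 0.5])
samples = rng.uniform(lows, highs, size=(n_samples, 3))

outputs = np.array([run_model(p) for p in samples])
behavioural = outputs > 1.0  # illustrative classification criterion

# A parameter is 'significant' if its distribution differs between
# the behavioural and non-behavioural sets (Kolmogorov-Smirnov test).
for i, name in enumerate(names):
    stat, pval = ks_2samp(samples[behavioural, i], samples[~behavioural, i])
    print(f"{name}: KS={stat:.3f}, p={pval:.3g}")
```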
Abstract:
Accurately and reliably identifying the number of clusters present within a dataset of gene expression profiles, when no additional information on cluster structure is available, is a problem addressed by few algorithms. GeneMCL transforms microarray analysis data into a graph consisting of nodes connected by edges, where the nodes represent genes and the edges represent the similarity in expression of those genes, as given by a proximity measurement. This measurement is taken to be the Pearson correlation coefficient combined with a local non-linear rescaling step. The resulting graph is input to the Markov Cluster (MCL) algorithm, an elegant, deterministic, non-specific and scalable method which models stochastic flow through the graph. The algorithm is inherently sensitive to any cluster structure present and rapidly decomposes a graph into cohesive clusters. The potential of the GeneMCL algorithm is demonstrated with a 5730-gene subset (IGS) of the Van't Veer breast cancer database, for which the clusterings are shown to reflect underlying biological mechanisms.
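The MCL iteration itself is compact: flow is simulated by raising a column-stochastic matrix to a power (expansion) and then sharpened by an element-wise power followed by column renormalisation (inflation), repeated to convergence. The sketch below shows this core loop on a toy graph; the graph-construction step described above (Pearson correlation with local non-linear rescaling) is omitted, and the inflation parameter, example graph and cluster-extraction heuristic are illustrative choices.

```python
import numpy as np

def mcl(adjacency, expansion=2, inflation=2.0, n_iter=100, eps=1e-9):
    """Minimal Markov Cluster (MCL) iteration on a similarity matrix.

    Alternates expansion (matrix power, simulating flow) with
    inflation (element-wise power plus column renormalisation,
    strengthening intra-cluster flow) until convergence.
    """
    m = adjacency + np.eye(len(adjacency))     # add self-loops
    m = m / m.sum(axis=0)                      # make column-stochastic
    for _ in range(n_iter):
        prev = m
        m = np.linalg.matrix_power(m, expansion)   # expansion
        m = m ** inflation                         # inflation
        m = m / m.sum(axis=0)
        if np.allclose(m, prev, atol=eps):
            break
    # Rows retaining mass act as attractors; the columns attracted
    # to the same row form one cluster (deduplicated below).
    clusters = {tuple(np.nonzero(row > eps)[0]) for row in m if row.max() > eps}
    return [list(c) for c in clusters]

# Toy similarity graph with two obvious clusters, {0,1,2} and {3,4}.
a = np.array([[0, 1, 1, 0, 0],
              [1, 0, 1, 0, 0],
              [1, 1, 0, 1, 0],
              [0, 0, 1, 0, 1],
              [0, 0, 0, 1, 0]], dtype=float)
print(mcl(a))
```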
Abstract:
During the last 15 years, a series of food scares and crises (BSE, dioxin, foot-and-mouth disease) have seriously undermined public confidence in food producers and operators and their capacity to produce safe food. As a result, food safety has become a top priority of the European legislative authorities, and national food control systems have been tightened, including through the establishment of the European Food Safety Authority. In Greece, a law creating the Hellenic Food Safety Authority has been approved. The main objectives of this Authority are to promote food safety for consumers and to inform them of any changes or developments in the food and health sector. The paper reviews the general structure of the current food control system in Greece. It describes the structure and mission of the Hellenic Food Safety Authority, explains its inspection strategy, and analyses the preliminary results of those inspections. Details are also given of the personnel training and of the certification and accreditation standards to be met by the Authority by the end of 2004.
Abstract:
Increased concerns over food safety have led to the adoption of international guidance on the key elements of national food control systems. This guidance has been used to conduct an initial assessment of the status of the food control systems in the countries belonging to the Gulf Cooperation Council. Our research has identified how these countries have been attempting to enhance their food control systems. Although the countries take different approaches to food control management, cooperation is leading to increased harmonization of legislation and food control practices. Progress is being made, but there is evidence of some weaknesses where additional effort may be needed.
Abstract:
In this paper we present an error analysis for a Monte Carlo algorithm for evaluating bilinear forms of matrix powers. An Almost Optimal Monte Carlo (MAO) algorithm for solving this problem is formulated. Results on the structure of the probability error are presented, and the construction of robust and interpolation Monte Carlo algorithms is discussed. Results are presented comparing the performance of the Monte Carlo algorithm with that of a corresponding deterministic algorithm. The two algorithms are tested on a well-balanced matrix, and the effects of perturbing this matrix, by small and large amounts, are then studied.
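The bilinear form in question is v^T A^k h, which a Monte Carlo method can estimate with random walks on the matrix indices. The sketch below uses initial and transition densities proportional to |v_i| and |a_ij|, the "almost optimal" choice associated with MAO-type algorithms, and compares the estimate against the deterministic computation. It is a simplified illustration, not the paper's exact algorithm, and the test matrix is an arbitrary example rather than the well-balanced matrix the authors use.

```python
import numpy as np

rng = np.random.default_rng(1)

def mc_bilinear_form(v, a, h, k, n_walks=50_000):
    """Monte Carlo estimate of the bilinear form v^T A^k h.

    Random walks of length k are drawn with initial density
    proportional to |v_i| and transition density proportional
    to |a_ij|; the weights below make the estimator unbiased.
    """
    v_abs = np.abs(v)
    p0 = v_abs / v_abs.sum()
    row_norms = np.abs(a).sum(axis=1)
    p_trans = np.abs(a) / row_norms[:, None]
    total = 0.0
    for _ in range(n_walks):
        i = rng.choice(len(v), p=p0)
        w = np.sign(v[i]) * v_abs.sum()          # cancels the initial density
        for _ in range(k):
            j = rng.choice(len(v), p=p_trans[i])
            w *= np.sign(a[i, j]) * row_norms[i]  # cancels the transition density
            i = j
        total += w * h[i]
    return total / n_walks

a = np.array([[0.5, 0.2, 0.1],
              [0.1, 0.4, 0.2],
              [0.2, 0.1, 0.3]])
v = np.array([1.0, 0.5, 0.25])
h = np.array([0.2, 1.0, 0.5])
k = 3
print("Monte Carlo  :", mc_bilinear_form(v, a, h, k))
print("Deterministic:", v @ np.linalg.matrix_power(a, k) @ h)
```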
Abstract:
We provide a system identification framework for the analysis of THz-transient data. The subspace identification algorithm for both deterministic and stochastic systems is used to model the time-domain responses of structures under broadband excitation. Structures with additional time delays can be modelled within the state-space framework using additional state variables. We compare the numerical stability of the commonly used least-squares ARX models with that of the subspace N4SID algorithm, using examples of fourth-order and eighth-order systems under pulse and chirp excitation conditions; these models correspond to structures with two and four simultaneously propagating modes, respectively. We show that chirp excitation combined with the subspace identification algorithm provides better identification of the underlying mode dynamics than the ARX model as the complexity of the system increases. The use of an identified state-space model for mode demixing, upon transformation to a decoupled realization form, is illustrated. Applications of state-space models and the N4SID algorithm to THz transient spectroscopy and to optical systems are highlighted.
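As a point of reference for the ARX baseline, an ARX model expresses the current output as a linear combination of lagged outputs and inputs and is fitted by ordinary least squares. The sketch below fits a second-order single-input ARX model to data simulated from a known system; it is a minimal illustration of the least-squares step only (a second-order toy rather than the fourth- and eighth-order systems above), and N4SID itself, which estimates a state-space realization from data, is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(2)

def fit_arx(y, u, na, nb):
    """Least-squares ARX fit: y[t] = sum_i a_i*y[t-i] + sum_j b_j*u[t-j].

    Builds a regressor matrix from lagged outputs and inputs and
    solves for the coefficients with numpy's lstsq; a simplified
    sketch of the ARX baseline discussed above.
    """
    n0 = max(na, nb)
    rows = [np.concatenate([y[t - na:t][::-1], u[t - nb:t][::-1]])
            for t in range(n0, len(y))]
    phi = np.array(rows)
    theta, *_ = np.linalg.lstsq(phi, y[n0:], rcond=None)
    return theta[:na], theta[na:]

# Simulate a known stable second-order system and recover its coefficients.
a_true, b_true = [1.5, -0.7], [0.5, 0.3]
u = rng.standard_normal(2000)
y = np.zeros(2000)
for t in range(2, 2000):
    y[t] = (a_true[0] * y[t - 1] + a_true[1] * y[t - 2]
            + b_true[0] * u[t - 1] + b_true[1] * u[t - 2]
            + 0.01 * rng.standard_normal())   # small measurement noise
a_est, b_est = fit_arx(y, u, na=2, nb=2)
print("a:", np.round(a_est, 3), " b:", np.round(b_est, 3))
```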