93 results for Operability


Relevance: 10.00%

Abstract:

The initial aim of this research was to investigate the application of expert systems, or knowledge-based systems, to the automated synthesis of Hazard and Operability (HAZOP) Studies. Owing to the generic nature of fault analysis problems and the way knowledge-based systems work, this goal evolved into a consideration of automated support for fault analysis in general, covering HAZOP, Fault Tree Analysis, FMEA and fault diagnosis in the process industries. This thesis describes a proposed architecture for such an expert system. The purpose of the system is to produce a descriptive model of faults and fault propagation from a description of the physical structure of the plant; from these descriptive models, the desired fault analysis may be produced. The way in which this is done reflects the complexity of the problem, which, in principle, encompasses the whole discipline of process engineering. An attempt is made to incorporate the perceived method an expert uses to solve the problem: keywords, heuristics and guidelines from techniques such as HAZOP and fault tree synthesis are used. In a truly expert system, performance depends strongly on the quality of the incorporated knowledge. This expert knowledge takes the form of heuristics, or rules of thumb, used in problem solving. This research has shown that, to apply fault analysis heuristics, it is necessary to have a representation of the details of fault propagation within a process. This helps to ensure the robustness of the system: a gradual rather than abrupt degradation at the boundaries of the domain knowledge.
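The idea of propagating fault deviations through a plant description can be sketched as follows. This is a minimal illustration, not the thesis's architecture: the unit types, deviation keywords and rules are all hypothetical, standing in for the HAZOP-style heuristics the text describes.

```python
# Minimal sketch (all rules and unit names hypothetical): propagating a
# HAZOP-style deviation through a plant modelled as a directed graph.
from collections import deque

# Each unit type maps an incoming deviation to its local consequence,
# which is then passed to downstream units.
RULES = {
    "pump":  {"no flow": "no flow", "low pressure": "low flow"},
    "valve": {"no flow": "no flow", "low flow": "low flow"},
    "tank":  {"no flow": "low level", "low flow": "low level"},
}

def propagate(plant, start_unit, deviation):
    """Breadth-first propagation; returns {unit: local consequence}."""
    seen, queue, effects = set(), deque([(start_unit, deviation)]), {}
    while queue:
        unit, dev = queue.popleft()
        if (unit, dev) in seen:
            continue
        seen.add((unit, dev))
        # Unknown deviations pass through unchanged (graceful degradation).
        local = RULES.get(plant[unit]["type"], {}).get(dev, dev)
        effects[unit] = local
        for nxt in plant[unit]["out"]:
            queue.append((nxt, local))
    return effects

plant = {
    "P1": {"type": "pump", "out": ["V1"]},
    "V1": {"type": "valve", "out": ["T1"]},
    "T1": {"type": "tank", "out": []},
}
print(propagate(plant, "P1", "no flow"))
```

The fallback for unknown deviations illustrates the robustness point above: coverage degrades gradually, rather than failing, at the boundary of the encoded knowledge.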

Relevance: 10.00%

Abstract:

Mineral wool insulation material applied to the primary cooling circuit of a nuclear reactor may be damaged in the course of a loss-of-coolant accident (LOCA). The insulation material released by the leak may compromise the operation of the emergency core cooling system (ECCS), as it may be transported together with the coolant, in the form of mineral wool fiber agglomerate (MWFA) suspensions, to the containment sump strainers, which are mounted at the inlet of the ECCS to keep debris away from the emergency cooling pumps. In the further course of the LOCA, the MWFA may block or penetrate the strainers. In addition to the impact of MWFA on the pressure drop across the strainers, corrosion products formed over time may also accumulate in the fiber cakes on the strainers, which can lead to a significant increase in the strainer pressure drop and result in cavitation in the ECCS. It is therefore essential to understand the transport characteristics of the insulation materials in order to determine the long-term operability of nuclear reactors that undergo a LOCA. An experimental and theoretical study performed by the Helmholtz-Zentrum Dresden-Rossendorf and the Hochschule Zittau/Görlitz is investigating the phenomena that may be observed in the containment vessel during a primary circuit coolant leak. The study entails the generation of fiber agglomerates, the determination of their transport properties in single- and multi-effect experiments, and the long-term effects that particles formed by corrosion of metallic containment internals have on the strainer pressure drop. The focus of this presentation is on the numerical models used to predict the transport of MWFA in CFD simulations, in which the MWFA are represented by a number of pseudo-continuous dispersed phases of spherical wetted agglomerates.
The size and density of the agglomerates, the relative viscosity of the fluid-fiber agglomerate mixture and the turbulent dispersion all affect how the fiber agglomerates are transported. In the cases described here, the size is kept constant while the density is varied; this definition affects both the terminal velocity and the volume fraction of the dispersed phases. Only one of the single-effect experimental scenarios used to validate the numerical models is described here. The scenario examines the suspension and horizontal transport of the fiber agglomerates in a racetrack-type channel. The corresponding experiments are described in an accompanying presentation (see abstract of Seeliger et al.).
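The link between an agglomerate's assumed density and its terminal velocity can be illustrated with the classical Stokes settling law for a small sphere (valid at low particle Reynolds number). This is a generic textbook relation, not the study's CFD model, and the diameter and density values below are hypothetical:

```python
# Illustrative only: Stokes terminal settling velocity of a small sphere,
# showing how the assumed agglomerate density and diameter set the sink
# rate. Values are hypothetical, not taken from the study.
def stokes_terminal_velocity(d, rho_p, rho_f=998.0, mu=1.0e-3, g=9.81):
    """Terminal velocity (m/s) of a sphere of diameter d (m) and density
    rho_p (kg/m^3) in a fluid of density rho_f (kg/m^3) and dynamic
    viscosity mu (Pa*s)."""
    return (rho_p - rho_f) * g * d**2 / (18.0 * mu)

# A wetted agglomerate barely denser than water settles slowly:
v = stokes_terminal_velocity(d=1e-3, rho_p=1050.0)
print(f"{v * 1000:.2f} mm/s")  # → 28.34 mm/s
```

Because the velocity scales linearly with the density difference (rho_p - rho_f), varying the assumed density while holding size constant, as described above, directly shifts the settling behaviour of each dispersed phase.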

Relevance: 10.00%

Abstract:

Commercial process simulators are of increasing interest in chemical engineering education. In this paper, the use of the commercial dynamic simulation software D-SPICE® and K-Spice® in three different chemical engineering courses is described and discussed. The courses cover the following topics: basic chemical engineering, operability and safety analysis, and process control. User experiences from both teachers and students are presented. The benefits of dynamic simulation as an additional teaching tool are discussed and summarized. The experiences confirm that commercial dynamic simulators provide realistic training and can be successfully integrated into undergraduate and graduate teaching, laboratory courses and research. © 2012 The Institution of Chemical Engineers.

Relevance: 10.00%

Abstract:

This paper contributes a new methodology, Waste And Source-matter ANalyses (WASAN), which supports a group in building agreeable actions for safely minimising avoidable waste. WASAN integrates influences from the Operational Research (OR) methodologies/philosophies of Problem Structuring Methods, Systems Thinking, simulation modelling and sensitivity analysis, as well as the industry approaches of the Waste Management Hierarchy, Hazard and Operability (HAZOP) Studies and As Low As Reasonably Practicable (ALARP). The paper shows how these influences are compiled into facilitative structures that support managers in developing recommendations on how to reduce avoidable waste production. WASAN is being designed as Health and Safety Executive guidance on what constitutes good decision-making practice for the companies that manage nuclear sites. In this paper we report and reflect on its use in two soft OR/problem structuring workshops conducted on radioactive waste in the nuclear industry. Crown Copyright © 2010.

Relevance: 10.00%

Abstract:

This paper deals with legal unorthodoxy. Its main aim is to study the so-called unorthodox taxes Hungary has adopted in recent years. The study of these taxes is preceded by a more general discussion of how law is made under unorthodoxy and of the special features of unorthodox legal policy. Unorthodoxy challenges equality before the law and is critical of mass democracies. It also raises doubts about the operability of the rule of law, relying on personal skills, or loyalty, rather than on the impersonal mechanisms arising from the checks and balances developed by the division of political power. Moreover, for lack of legal suppositions, legislation suffers from casuistry and regulatory capture.

Relevance: 10.00%

Abstract:

The virtual quadrilateral is the coalescence of novel data structures that reduces the storage requirements of spatial data without jeopardizing the quality and operability of the inherent information. The data representative of the observed area are parsed to ascertain the contiguous measures that, when contained, implicitly define a quadrilateral. The virtual quadrilateral then represents a geolocated area of the observed space where all of the measures are the same. The area, contoured as a rectangle, is pseudo-delimited by the opposite coordinates of the bounding area. Once defined, the virtual quadrilateral represents an area in the observed space and is represented in a database by the attributes of its bounding coordinates and the measure of its contiguous space. Virtual quadrilaterals have been found to ensure a lossless reduction of physical storage, maintain the implied features of the data, facilitate the rapid retrieval of vast amounts of the represented spatial data and accommodate complex queries. The methods presented herein demonstrate that virtual quadrilaterals are created quite easily, are stable and versatile objects in a database and have proven beneficial to exigent spatial data applications such as geographic information systems.
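The core idea, collapsing a contiguous region of equal measures into one record holding its opposite bounding coordinates and the shared value, can be sketched with a simple greedy decomposition. This is a hypothetical illustration of the concept, not the dissertation's algorithm:

```python
# Sketch (hypothetical, not the dissertation's code): greedily collapse
# rectangles of equal values in a grid into "virtual quadrilaterals",
# each stored as ((r1, c1), (r2, c2), value) — opposite corners + measure.
def to_virtual_quads(grid):
    rows, cols = len(grid), len(grid[0])
    covered = [[False] * cols for _ in range(rows)]
    quads = []
    for r in range(rows):
        for c in range(cols):
            if covered[r][c]:
                continue
            v = grid[r][c]
            # Grow the rectangle rightward while the value matches.
            c2 = c
            while c2 + 1 < cols and not covered[r][c2 + 1] and grid[r][c2 + 1] == v:
                c2 += 1
            # Grow downward while the whole row segment matches.
            r2 = r
            while r2 + 1 < rows and all(
                not covered[r2 + 1][k] and grid[r2 + 1][k] == v
                for k in range(c, c2 + 1)
            ):
                r2 += 1
            for rr in range(r, r2 + 1):
                for cc in range(c, c2 + 1):
                    covered[rr][cc] = True
            quads.append(((r, c), (r2, c2), v))
    return quads

grid = [
    [5, 5, 5, 7],
    [5, 5, 5, 7],
]
print(to_virtual_quads(grid))
```

Here eight cells reduce losslessly to two records, and the original grid is fully recoverable from the corner coordinates and values, mirroring the storage-reduction claim above.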

Relevance: 10.00%

Abstract:

The lack of analytical models that can accurately describe large-scale networked systems makes empirical experimentation indispensable for understanding complex behaviors. Research on network testbeds for testing network protocols and distributed services, including physical, emulated, and federated testbeds, has made steady progress. Although the success of these testbeds is undeniable, they fail to provide: 1) scalability, for handling large-scale networks with hundreds or thousands of hosts and routers organized in different scenarios, 2) flexibility, for testing new protocols or applications in diverse settings, and 3) inter-operability, for combining simulated and real network entities in experiments. This dissertation tackles these issues in three different dimensions. First, we present SVEET, a system that enables inter-operability between real and simulated hosts. In order to increase the scalability of networks under study, SVEET enables time-dilated synchronization between real hosts and the discrete-event simulator. Realistic TCP congestion control algorithms are implemented in the simulator to allow seamless interactions between real and simulated hosts. SVEET is validated via extensive experiments and its capabilities are assessed through case studies involving real applications. Second, we present PrimoGENI, a system that allows a distributed discrete-event simulator, running in real-time, to interact with real network entities in a federated environment. PrimoGENI greatly enhances the flexibility of network experiments, through which a great variety of network conditions can be reproduced to examine what-if questions. Furthermore, PrimoGENI performs resource management functions, on behalf of the user, for instantiating network experiments on shared infrastructures. Finally, to further increase the scalability of network testbeds to handle large-scale high-capacity networks, we present a novel symbiotic simulation approach. 
We present SymbioSim, a testbed for large-scale network experimentation where a high-performance simulation system closely cooperates with an emulation system in a mutually beneficial way. On the one hand, the simulation system benefits from incorporating the traffic metadata from real applications in the emulation system to reproduce the realistic traffic conditions. On the other hand, the emulation system benefits from receiving the continuous updates from the simulation system to calibrate the traffic between real applications. Specific techniques that support the symbiotic approach include: 1) a model downscaling scheme that can significantly reduce the complexity of the large-scale simulation model, resulting in an efficient emulation system for modulating the high-capacity network traffic between real applications; 2) a queuing network model for the downscaled emulation system to accurately represent the network effects of the simulated traffic; and 3) techniques for reducing the synchronization overhead between the simulation and emulation systems.
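The time-dilation idea behind synchronizing real and simulated hosts can be sketched in a few lines. This illustrates the general technique only, with a hypothetical dilation factor; it is not SVEET's implementation:

```python
# General idea of time dilation (illustrative, not SVEET's code): scale
# wall-clock time by a time-dilation factor (TDF) so a simulator that is
# slower than real time still appears to keep pace with real hosts.
TDF = 10.0  # hypothetical: virtual time runs 10x slower than wall-clock

def to_virtual(real_elapsed, tdf=TDF):
    """Wall-clock seconds -> virtual seconds perceived by a dilated host."""
    return real_elapsed / tdf

def to_real(virtual_elapsed, tdf=TDF):
    """Virtual seconds -> wall-clock seconds the experiment must run."""
    return virtual_elapsed * tdf

# A 2 s virtual transfer occupies 20 s of wall-clock time at TDF = 10,
# giving the simulator 10x more real time per virtual second.
print(to_real(2.0))
print(to_virtual(20.0))
```

The trade-off is the one the scalability discussion above implies: a larger TDF lets the simulator handle bigger networks at the cost of proportionally longer wall-clock experiment time.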

Relevance: 10.00%

Abstract:

Global niobium production is presently dominated by three operations: Araxá and Catalão (Brazil) and Niobec (Canada). Although Brazil accounts for over 90% of the world's niobium production, a number of high-grade niobium deposits exist worldwide. The advancement of these deposits depends largely on the development of operable beneficiation flowsheets. Pyrochlore, the primary niobium mineral, is typically upgraded by flotation with amine collectors at acidic pH, following a complicated flowsheet with significant losses of niobium. This research compares the typical two-stage flotation flowsheet to a direct flotation process (i.e. elimination of gangue pre-flotation) with the objective of circuit simplification. In addition, the use of a chelating reagent, benzohydroxamic acid (BHA), was studied as an alternative collector for fine-grained, highly disseminated pyrochlore. For the amine-based reagent system, results showed that, while the two approaches were comparable at the laboratory scale, when scaled up to the pilot level the direct flotation process suffered from circuit instability: high quantities of dissolved calcium in the process water, due to stream recirculation and fine calcite dissolution, ultimately depressed the pyrochlore. This scale-up issue was not observed in pilot plant operation of the two-stage flotation process, as a portion of the highly reactive carbonate minerals was removed prior to acid addition. A statistical model was developed for batch flotation using BHA on a carbonatite ore (0.25% Nb2O5) that could not be effectively upgraded using the conventional amine reagent scheme. Results showed that it was possible to produce a concentrate containing 1.54% Nb2O5 at 93% Nb recovery in ~15% of the original mass.
Fundamental studies undertaken included FT-IR and XPS, which showed the adsorption of both the protonated amine and the neutral amine onto the surface of the pyrochlore (possibly at niobium sites, as indicated by detected shifts in the Nb 3d binding energy). The results suggest that the preferential flotation of pyrochlore over quartz with amines at low pH can be attributed to a difference in critical hemimicelle concentration (CHC) values for the two minerals. BHA was found to adsorb on pyrochlore surfaces by a mechanism similar to that of alkyl hydroxamic acid. It is hoped that this work will assist in improving the operability of existing pyrochlore flotation circuits and help promote the development of niobium deposits globally. Future studies should focus on specific gangue mineral depressants and on the inadvertent activation phenomena related to BHA flotation of gangue minerals.
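The reported concentrate figures are consistent with the standard two-product mass balance. A quick check, using only the grades and mass pull quoted above (the rounded ~15% mass pull accounts for the small gap to the stated 93%):

```python
# Two-product flotation mass balance: recovery R = (C/F) * (c/f), where
# C/F is the mass pull to concentrate, c the concentrate grade and f the
# feed grade. Checking the BHA batch-test figures quoted above.
def recovery(mass_pull, conc_grade, feed_grade):
    """Fraction of the valuable mineral reporting to the concentrate."""
    return mass_pull * conc_grade / feed_grade

R = recovery(mass_pull=0.15, conc_grade=1.54, feed_grade=0.25)
print(f"{R:.0%}")  # → 92%
```

At exactly 15% mass pull this gives 92.4% Nb recovery, matching the reported ~93% at ~15% of the original mass.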

Relevance: 10.00%

Abstract:

Intelligent Tutoring Systems (ITSs) are computerized systems for learning-by-doing. These systems provide students with immediate and customized feedback on learning tasks. An ITS typically consists of several modules that are connected to each other. This research focuses on the distribution of the ITS module that provides expert knowledge services. For the distribution of such an expert knowledge module we need to use an architectural style because this gives a standard interface, which increases the reusability and operability of the expert knowledge module. To provide expert knowledge modules in a distributed way we need to answer the research question: ‘How can we compare and evaluate REST, Web services and Plug-in architectural styles for the distribution of the expert knowledge module in an intelligent tutoring system?’. We present an assessment method for selecting an architectural style. Using the assessment method on three architectural styles, we selected the REST architectural style as the style that best supports the distribution of expert knowledge modules. With this assessment method we also analyzed the trade-offs that come with selecting REST. We present a prototype and architectural views based on REST to demonstrate that the assessment method correctly scores REST as an appropriate architectural style for the distribution of expert knowledge modules.
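An assessment of this kind typically scores each candidate style against weighted criteria. The sketch below illustrates that general pattern only; the criteria, weights and scores are hypothetical and are not taken from the paper's assessment method:

```python
# Hypothetical weighted-scoring sketch of comparing architectural styles.
# Criteria, weights and scores are illustrative, not the paper's values.
CRITERIA = {"reusability": 0.4, "interoperability": 0.3, "ease_of_deployment": 0.3}

SCORES = {  # 1 (poor) .. 5 (good), illustrative
    "REST":         {"reusability": 5, "interoperability": 5, "ease_of_deployment": 4},
    "Web services": {"reusability": 4, "interoperability": 5, "ease_of_deployment": 3},
    "Plug-in":      {"reusability": 3, "interoperability": 2, "ease_of_deployment": 5},
}

def rank(scores, criteria):
    """Weighted total per style, best first."""
    totals = {
        style: sum(criteria[c] * s[c] for c in criteria)
        for style, s in scores.items()
    }
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

print(rank(SCORES, CRITERIA))
```

The ranking is only as good as the weights, which is why the trade-off analysis mentioned above matters: a different weighting of the same scores can reorder the styles.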

Relevance: 10.00%

Abstract:

As products and creative processes become increasingly digitally mediated, there has been recent reflection on the relationship between images and the tools used to produce them. The natural, close relationship between the conceptual and the physical dimensions opens a discussion at the level of the semantics and processes of designing and manipulating images, in which CAD tools are naturally included. Given the unequivocal and fundamental role of drawing in design and 3D modelling, it is pertinent to understand the relationship and articulation between these two tools. Recognizing drawing as a tool of the physical domain capable of expressing the thinking that transforms abstract conceptions into concrete ones, seeing it reflected in the virtual dimension through 3D CAD software is not trivial, since the latter is generally processed through thinking whose context is far removed from materiality. Methodologically, we approach this question by testing the hypothesis through a proposed practical exercise that evaluates the effect analogue images may have on the recognition and operability of the Blender tool in an academic setting. The aim is thus to understand how analogue drawing can be integrated into the 3D modelling process and what relationship it maintains with those who operate with it. The articulation of drawing with design production tools, specifically 3D CAD, will make it possible to understand in detail the articulation between tools of different natures, both in the design process and in the creation of visual artefacts, and may also open the discussion on pedagogical strategies for teaching drawing and 3D in a Design programme.

Relevance: 10.00%

Abstract:

During the last decade, wind power generation has seen rapid development. According to the U.S. Department of Energy, achieving 20% wind power penetration in the U.S. by 2030 will require: (i) enhancement of the transmission infrastructure, (ii) improvement of the reliability and operability of wind systems and (iii) increased U.S. manufacturing capacity for wind generation equipment. This research concentrates on improving the reliability and operability of wind energy conversion systems (WECSs). The increased penetration of wind energy into the grid imposes new operating conditions on power systems, a change that requires the development of an adequate reliability framework. This thesis proposes a framework for assessing WECS reliability in the face of external disturbances (e.g., grid faults) and internal component faults. The framework is illustrated using a detailed model of a type C WECS (doubly fed induction generator) with the corresponding deterministic and random variables in a simplified grid model. Fault parameters and performance requirements essential to reliability measurement are included in the simulation. The proposed framework allows quantitative analysis of WECS designs; analysis of WECS control schemes, e.g., fault ride-through mechanisms; discovery of key parameters that influence overall WECS reliability; and computation of WECS reliability with respect to different grid codes/performance requirements.
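The shape of such a reliability computation, sampling random fault parameters and checking each sample against a performance requirement, can be sketched with a toy Monte Carlo estimate. The ride-through limit, parameter distributions and numbers below are entirely hypothetical, standing in for the thesis's detailed WECS and grid model:

```python
# Toy Monte Carlo sketch (hypothetical numbers, not the thesis's model):
# sample random grid-fault parameters and estimate the probability that
# the WECS satisfies a ride-through performance requirement.
import random

random.seed(1)

def rides_through(voltage_dip, duration, max_dip=0.85, max_duration=0.15):
    """Hypothetical grid-code requirement: the turbine must ride through
    unless the dip is deeper than max_dip AND lasts longer than
    max_duration seconds."""
    return voltage_dip <= max_dip or duration <= max_duration

def reliability(n=100_000):
    """Estimated probability of riding through a random fault."""
    ok = 0
    for _ in range(n):
        dip = random.uniform(0.0, 1.0)   # per-unit voltage dip
        dur = random.uniform(0.0, 0.5)   # fault duration, seconds
        ok += rides_through(dip, dur)
    return ok / n

print(f"{reliability():.3f}")
```

With these uniform distributions the analytical answer is 1 - 0.15 * 0.7 = 0.895, so the estimate should land near that value; swapping in a different grid code simply means changing the `rides_through` predicate, which is the kind of parameterization the framework above enables.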

Relevance: 10.00%

Abstract:

The last couple of decades have been the stage for the introduction of new telecommunication networks. It is expected that in the future all types of vehicles, such as cars, buses and trucks, will have the ability to intercommunicate and form a vehicular network. Vehicular networks display particularities when compared to other networks due to their continuous node mobility and wide geographical dispersion, leading to permanent network fragmentation. Therefore, the main challenges this type of network entails relate to intermittent connectivity and the long, variable delay in information delivery. To address the problems related to intermittent connectivity, a new concept was introduced: the Delay Tolerant Network (DTN). This architecture is built on a Store-Carry-and-Forward (SCF) mechanism in order to assure the delivery of information when no end-to-end path is defined. Vehicular networks support a multiplicity of services, including the transport of non-urgent information, so the use of a DTN for the dissemination of non-urgent information can surpass the aforementioned challenges. The work developed focused on the use of DTNs for the dissemination of non-urgent information. This information originates with the network service provider and should be available on mobile network terminals during a limited period of time. To this end, four different strategies were deployed: Random, Least Number of Hops First (LNHF), Local Rarest Bundle First (LRBF) and Local Rarest Generation First (LRGF). All of these strategies have a common goal: to disseminate content across the network in the shortest period of time while minimizing network congestion. This work also contemplates the analysis and implementation of techniques that reduce network congestion. The design, implementation and validation of the proposed strategies were divided into three stages.
The first stage focused on creating a Matlab emulator for fast implementation and strategy validation. This stage resulted in the four strategies that were afterwards implemented in the DTN software Helix, developed in a partnership between Instituto de Telecomunicações (IT) and Veniam, which are responsible for the largest operating vehicular network worldwide, located in the city of Porto. The strategies were later evaluated on an emulator built for large-scale testing of DTNs. Both emulators account for vehicular mobility based on information previously collected from the real platform. Finally, the strategy that presented the best overall performance was tested on a real platform, in a lab environment, for concept and operability demonstration. It is possible to conclude that two of the implemented strategies (LRBF and LRGF) can be deployed in the real network and guarantee a significant delivery rate. The LRBF strategy has the best performance in terms of delivery; however, it adds significant overhead to the network in order to work. In the future, scalability tests should be conducted in a real environment to confirm the emulator results, and the real implementation of the strategies should be accompanied by the introduction of new types of services for content distribution.
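The Local Rarest Bundle First idea can be sketched in a few lines: at each contact opportunity, a node forwards first the bundle it has observed on the fewest neighbours, so rare content spreads fastest. This is an illustration of the general rarest-first principle with hypothetical bundle names, not the Helix implementation:

```python
# Sketch of the Local Rarest Bundle First idea (illustrative, not the
# Helix implementation): order a node's bundles by how rarely they appear
# in the inventories of recently contacted neighbours.
from collections import Counter

def lrbf_order(my_bundles, neighbour_inventories):
    """Return my bundles ordered rarest-first among neighbours."""
    counts = Counter()
    for inv in neighbour_inventories:
        counts.update(set(inv) & set(my_bundles))
    # Counter returns 0 for bundles no neighbour holds, so unseen
    # bundles are forwarded first.
    return sorted(my_bundles, key=lambda b: counts[b])

mine = ["b1", "b2", "b3"]
neighbours = [["b1", "b2"], ["b1"], ["b2", "b3"]]
print(lrbf_order(mine, neighbours))  # → ['b3', 'b1', 'b2']
```

The overhead noted above comes from exactly this bookkeeping: nodes must exchange inventory summaries at each contact to maintain the rarity counts.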

Relevance: 10.00%

Abstract:

This undergraduate thesis (TFG) relates to an internship carried out with the Municipal Guard of San Sebastián (Basque Country) and aims to analyse the protection orders granted to immigrant and national victims of gender violence, to determine whether there is any difference in how they are applied. It also analyses the procedure followed by this police force and by social services in these cases, as well as the profile of the victim and the aggressor. In addition, it examines controversial aspects of Organic Law 1/2004 on Comprehensive Protection Measures against Gender Violence. A mixed methodology is used: on the qualitative side, four interviews (with two victims, a police officer and a social worker) explore the subject in depth; on the quantitative side, the Municipal Guard's gender-violence database is explored for statistical analysis. Finally, the conclusions reached are presented, and improvements are proposed for future research and for the operability of the Municipal Guard.

Relevance: 10.00%

Abstract:

Public procurement, as the universe in which public contracts are formed, can, when approached from an appropriate perspective, be considered a mechanism for achieving smart, sustainable growth while ensuring the efficient use of public funds. An internal control system, besides being an aid to decision making, increases the effectiveness and efficiency of operations, the reliability of financial information and compliance with the applicable legislation. Since a considerable share of the State Budget allocated to the Navy goes to maintenance, for which the Directorate of Ships (Direção de Navios) is responsible, it is of clear interest to create tools that allow better management of public funds and guarantee maximum operational readiness of the assets that contribute to the Navy's mission. This study aims to identify which elements of the formation and execution of public contracts can be improved in order to strengthen internal control in the procedural activity concerning the acquisition, construction and repair of naval assets and Navy support units. To this end, the 'case study' method was used, with the Directorate of Ships as the unit of study, combining different data-collection techniques. The study concludes that internal control in this unit needs to be strengthened. To address this problem, an internal control manual was drawn up whose main purpose, if implemented by this unit, is to ensure high performance in public procurement and to safeguard the public interest.