Abstract:
Fault resistance is a critical component of electric power systems operation due to its stochastic nature. If not considered, this parameter may interfere with fault analysis studies. This paper presents an iterative fault analysis algorithm for unbalanced three-phase distribution systems that considers a fault resistance estimate. The proposed algorithm is composed of two sub-routines, namely the fault resistance sub-routine and the bus impedance sub-routine. The fault resistance sub-routine, based on local fault records, estimates the fault resistance. The bus impedance sub-routine, based on the previously estimated fault resistance, estimates the system voltages and currents. Numerical simulations on the IEEE 37-bus distribution system demonstrate the algorithm's robustness and potential for offline applications, providing additional fault information to Distribution Operation Centers and enhancing the system restoration process. (C) 2011 Elsevier Ltd. All rights reserved.
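A minimal sketch of this alternating structure is given below, assuming a single-phase Thevenin-style bus impedance solution and an apparent-impedance estimate of the fault resistance from the local relay record; the names (zbus, v_relay, z_line) and the single-phase simplification are illustrative assumptions, not the authors' three-phase formulation.

```python
import numpy as np

def iterative_fault_analysis(zbus, v_pre, fault_bus, z_line, v_relay, i_relay,
                             tol=1e-6, max_iter=50):
    """Alternate the two sub-routines until the fault-resistance estimate settles:
    (1) bus impedance solution for the current Rf, (2) Rf update from the record."""
    rf = 0.0                                   # initial fault-resistance guess (ohms)
    i_fault, v_bus = 0.0, v_pre
    for _ in range(max_iter):
        # Bus impedance sub-routine: Thevenin-style fault solution for the current Rf.
        i_fault = v_pre[fault_bus] / (zbus[fault_bus, fault_bus] + rf)
        v_bus = v_pre - zbus[:, fault_bus] * i_fault
        # Fault resistance sub-routine: apparent-impedance style estimate from the
        # recorded relay voltage/current and the computed fault current.
        rf_new = np.real((v_relay - z_line * i_relay) / i_fault)
        if abs(rf_new - rf) < tol:
            break
        rf = rf_new
    return rf, i_fault, v_bus
```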
Abstract:
This letter presents an extension of an existing ground distance relay algorithm to include phase distance relays. The algorithm uses a fault resistance estimation process in the phase domain, improving the efficiency of the distance protection process. The results show that the algorithm is suitable for online applications and that its performance is independent of the fault resistance magnitude, the fault location, and the line asymmetry.
Abstract:
Distributed control systems consist of sensors, actuators and controllers, interconnected by communication networks, and are characterized by a high number of concurrent processes. This work presents a proposal for a procedure to model and analyze communication networks for distributed control systems in intelligent buildings. The approach considered for this purpose is based on the characterization of the control system as a discrete event system and the application of coloured Petri nets as a formal method for the specification, analysis and verification of control solutions. With this approach, we develop the models that compose the communication networks for the control systems of intelligent buildings, taking into account the relationships between the various building systems. This procedure provides a structured development of models, facilitating the process of specifying the control algorithm. An application example is presented in order to illustrate the main features of this approach.
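A toy place/transition Petri net for a sensor-to-controller message flow is sketched below; it is an assumed, uncoloured simplification intended only to illustrate the discrete event view (markings, enabling and firing), not the coloured Petri net models developed in the work.

```python
# Minimal place/transition Petri net: tokens per place, transitions as
# (input places, output places), with explicit enabling and firing rules.
class PetriNet:
    def __init__(self, marking, transitions):
        self.marking = dict(marking)          # tokens per place
        self.transitions = transitions        # name -> (input places, output places)

    def enabled(self, name):
        inputs, _ = self.transitions[name]
        return all(self.marking[p] > 0 for p in inputs)

    def fire(self, name):
        if not self.enabled(name):
            raise RuntimeError(f"transition {name} is not enabled")
        inputs, outputs = self.transitions[name]
        for p in inputs:
            self.marking[p] -= 1
        for p in outputs:
            self.marking[p] += 1

net = PetriNet(
    marking={"sensor_ready": 1, "msg_in_network": 0, "controller_done": 0},
    transitions={
        "send":    (["sensor_ready"], ["msg_in_network"]),
        "deliver": (["msg_in_network"], ["controller_done"]),
    },
)
net.fire("send")
net.fire("deliver")
print(net.marking)   # {'sensor_ready': 0, 'msg_in_network': 0, 'controller_done': 1}
```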
Abstract:
The present paper proposes a flexible consensus scheme for group decision making, which allows one to obtain a consistent collective opinion from information provided by each expert in terms of multigranular fuzzy estimates. It is based on a linguistic hierarchical model with multigranular sets of linguistic terms, and the choice of the most suitable set is a prerogative of each expert. From the human viewpoint, using such a model is advantageous, since it permits each expert to utilize linguistic terms that reflect more adequately the level of uncertainty intrinsic to his evaluation. From the operational viewpoint, the advantage of using such a model lies in the fact that it allows one to express the linguistic information in a unique domain, without loss of information, during the discussion process. The proposed consensus scheme supposes that the moderator can interfere in the discussion process in different ways. The intervention can be a request to any expert to update his opinion or can be the adjustment of the weight of each expert's opinion. An optimal adjustment can be achieved through the execution of an optimization procedure that searches for the weights that maximize a corresponding soft consensus index. In order to demonstrate the usefulness of the presented consensus scheme, a technique for multicriteria analysis, based on fuzzy preference relation modeling, is utilized for solving a hypothetical enterprise strategy planning problem, generated with the use of the Balanced Scorecard methodology. (C) 2009 Elsevier Inc. All rights reserved.
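The sketch below illustrates the weight-adjustment idea with an assumed soft consensus index (one minus the weighted mean absolute deviation from the collective opinion), over opinions already mapped to a common numeric domain; the index, the weight bounds and the figures are illustrative stand-ins for the paper's formulation.

```python
import numpy as np
from scipy.optimize import minimize

def soft_consensus(w, opinions):
    # Assumed index: 1 - weighted mean absolute deviation from the collective opinion.
    collective = np.dot(w, opinions)
    return 1.0 - np.dot(w, np.abs(opinions - collective))

def optimal_weights(opinions, w_min=0.1, w_max=0.5):
    # Bounds keep every expert in the process and avoid the degenerate
    # solution of putting all the weight on a single opinion.
    n = len(opinions)
    res = minimize(lambda w: -soft_consensus(w, opinions),
                   x0=np.full(n, 1.0 / n),
                   bounds=[(w_min, w_max)] * n,
                   constraints=({'type': 'eq', 'fun': lambda w: w.sum() - 1.0},))
    return res.x

opinions = np.array([0.20, 0.70, 0.75, 0.80])     # four experts, common domain
print(optimal_weights(opinions))                   # weight shifts away from the outlier
```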
Abstract:
Ecological niche modelling combines species occurrence points with environmental raster layers in order to obtain models for describing the probabilistic distribution of species. The process to generate an ecological niche model is complex. It requires dealing with a large amount of data, use of different software packages for data conversion, for model generation and for different types of processing and analyses, among other functionalities. A software platform that integrates all requirements under a single and seamless interface would be very helpful for users. Furthermore, since biodiversity modelling is constantly evolving, new requirements are constantly being added in terms of functions, algorithms and data formats. This evolution must be accompanied by any software intended to be used in this area. In this scenario, a Service-Oriented Architecture (SOA) is an appropriate choice for designing such systems. According to SOA best practices and methodologies, the design of a reference business process must be performed prior to the architecture definition. The purpose is to understand the complexities of the process (business process in this context refers to the ecological niche modelling problem) and to design an architecture able to offer a comprehensive solution, called a reference architecture, that can be further detailed when implementing specific systems. This paper presents a reference business process for ecological niche modelling, as part of a major work focused on the definition of a reference architecture based on SOA concepts that will be used to evolve the openModeller software package for species modelling. The basic steps that are performed while developing a model are described, highlighting important aspects, based on the knowledge of modelling experts. In order to illustrate the steps defined for the process, an experiment was developed, modelling the distribution of Ouratea spectabilis (Mart.) Engl. (Ochnaceae) using openModeller. As a consequence of the knowledge gained with this work, many desirable improvements on the modelling software packages have been identified and are presented. Also, a discussion on the potential for large-scale experimentation in ecological niche modelling is provided, highlighting opportunities for research. The results obtained are very important for those involved in the development of modelling tools and systems, for requirement analysis and to provide insight on new features and trends for this category of systems. They can also be very helpful for beginners in modelling research, who can use the process and the experiment example as a guide to this complex activity. (c) 2008 Elsevier B.V. All rights reserved.
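As a hedged illustration of the core modelling step in this process, the toy below fits a BIOCLIM-style environmental envelope from synthetic occurrence points and raster layers; openModeller provides this and many other algorithms, and the arrays here are invented stand-ins for real occurrence records and environmental data.

```python
import numpy as np

# Two synthetic environmental rasters (e.g. temperature and precipitation surrogates)
# and a handful of occurrence cells given as (row, col) indices.
rng = np.random.default_rng(1)
layers = rng.uniform(0.0, 30.0, size=(2, 50, 50))
occ = [(10, 12), (20, 25), (33, 8), (40, 40)]

# Fit the envelope: the min/max of each layer over the occurrence cells.
env_at_occ = np.array([[layer[r, c] for layer in layers] for r, c in occ])
lo, hi = env_at_occ.min(axis=0), env_at_occ.max(axis=0)

# Project the model: a cell is "suitable" if every layer falls inside the envelope.
suitable = np.all((layers >= lo[:, None, None]) & (layers <= hi[:, None, None]), axis=0)
print(suitable.sum(), "of", suitable.size, "cells predicted suitable")
```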
Abstract:
This article presents a tool for the allocation analysis of complex water resource systems, called AcquaNetXL, developed as a spreadsheet into which a linear optimization model and a nonlinear one were incorporated. AcquaNetXL keeps the concepts and attributes of a decision support system. In other words, it streamlines the communication between the user and the computer, facilitates the understanding and formulation of the problem and the interpretation of the results, and supports the decision-making process, turning it into a clear and organized one. The performance of the algorithms used for solving the water allocation problems was satisfactory, especially for the linear model.
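A minimal sketch of the kind of linear allocation problem such a tool solves is shown below, assuming priority-weighted deliveries limited by an available reservoir volume; the numbers and the scipy-based formulation are illustrative, not AcquaNetXL's implementation.

```python
from scipy.optimize import linprog

# Maximize priority-weighted deliveries x_i to three demand nodes,
# subject to the available reservoir volume; all figures are illustrative.
priorities = [3.0, 2.0, 1.0]            # relative priority of each demand
demands    = [40.0, 30.0, 50.0]         # requested volumes (hm^3)
available  = 80.0                       # reservoir volume (hm^3)

# linprog minimizes, so negate the priorities to maximize total weighted delivery.
res = linprog(c=[-p for p in priorities],
              A_ub=[[1.0, 1.0, 1.0]], b_ub=[available],
              bounds=[(0.0, d) for d in demands])
print(res.x)     # e.g. [40., 30., 10.]: higher-priority demands are served first
```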
Abstract:
Tropical countries, such as Brazil and Colombia, have the possibility of using agricultural lands for growing biomass to produce bio-fuels such as biodiesel and ethanol. This study applies an energy analysis to the production process of anhydrous ethanol obtained from the hydrolysis of the starch and the cellulosic and hemicellulosic material present in the banana fruit and its residual biomass. Four different production routes were analyzed: acid hydrolysis of amylaceous material (banana pulp and banana fruit) and enzymatic hydrolysis of lignocellulosic material (flower stalk and banana skin). The analysis considered banana plant cultivation, feedstock transport, hydrolysis, fermentation, distillation, dehydration, residue treatment and the utility plant. The best indexes were obtained for the amylaceous material, for which mass performance varied from 346.5 L/t to 388.7 L/t, Net Energy Value (NEV) ranged from 9.86 MJ/L to 9.94 MJ/L and the energy ratio was 1.9 MJ/MJ. For the lignocellulosic materials, the figures were less favorable: mass performance varied from 86.1 to 123.5 L/t, NEV from 5.24 to 8.79 MJ/L and the energy ratio from 1.3 to 1.6 MJ/MJ. The analysis showed, however, that both processes can be considered energetically feasible. (C) 2010 Elsevier Ltd. All rights reserved.
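The indicators can be illustrated with a short calculation, assuming NEV is the ethanol energy content minus the fossil energy input per litre and the energy ratio is their quotient; the input figure below is an assumed value chosen only to reproduce the order of magnitude reported for the amylaceous route.

```python
# Approximate lower heating value of anhydrous ethanol, MJ per litre.
LHV_ETHANOL = 21.2

def energy_indicators(energy_input_per_litre):
    """NEV (MJ/L) and energy ratio (MJ/MJ) for a given fossil energy input."""
    nev = LHV_ETHANOL - energy_input_per_litre
    ratio = LHV_ETHANOL / energy_input_per_litre
    return nev, ratio

nev, ratio = energy_indicators(11.3)   # assumed fossil input per litre, MJ/L
print(f"NEV = {nev:.2f} MJ/L, energy ratio = {ratio:.2f} MJ/MJ")
# -> roughly 9.9 MJ/L and 1.9 MJ/MJ, consistent with the amylaceous route figures
```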
Abstract:
Safety Instrumented Systems (SIS) are designed to prevent and/or mitigate accidents, avoiding undesirable high-potential-risk scenarios, assuring the protection of people's health, protecting the environment and saving industrial equipment costs. The design of these systems requires formal methods to ensure the safety requirements, but the material published in this area has not identified a consolidated procedure for this task. In this sense, this article introduces a formal method for the diagnosis and treatment of critical faults based on Bayesian networks (BN) and Petri nets (PN). This approach considers diagnosis and treatment for each safety instrumented function (SIF), including a hazard and operability (HAZOP) study of the equipment or system under control. It uses the BN and a Behavioral Petri net (BPN) for diagnosis and decision-making, and the PN for the synthesis, modeling and control to be implemented in a safety Programmable Logic Controller (PLC). An application example considering the diagnosis and treatment of critical faults is presented to illustrate the proposed methodology.
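A hedged toy of the BN diagnosis step is given below: a single root cause ("valve stuck") and one evidence node ("low-flow alarm"), with illustrative probabilities rather than values taken from a HAZOP study.

```python
# Two-node Bayesian network reduced to a direct application of Bayes' rule.
p_fault = 0.02                      # prior P(valve stuck), assumed
p_alarm_given_fault = 0.95          # P(low-flow alarm | valve stuck), assumed
p_alarm_given_ok = 0.05             # false-alarm rate, assumed

# Posterior probability of the fault once the alarm is observed.
p_alarm = p_alarm_given_fault * p_fault + p_alarm_given_ok * (1 - p_fault)
p_fault_given_alarm = p_alarm_given_fault * p_fault / p_alarm
print(f"P(valve stuck | alarm) = {p_fault_given_alarm:.2f}")   # about 0.28
```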
Abstract:
This work presents a mathematical model for the vinyl acetate and n-butyl acrylate emulsion copolymerization process in batch reactors. The model is able to explain the effects of simultaneous changes in emulsifier concentration, initiator concentration, monomer-to-water ratio, and monomer feed composition on monomer conversion, copolymer composition and, to a lesser extent, average particle size evolution histories. The main features of the system, such as the increase in the rate of polymerization as the temperature, emulsifier, and initiator concentrations increase, are correctly represented by the model. The model accounts for the basic features of the process and may be useful for practical applications, despite its simplicity and reduced number of adjustable parameters.
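One building block of such models can be sketched with the instantaneous (Mayo-Lewis) copolymer composition equation, which relates the monomer feed composition to the composition of the copolymer being formed; the reactivity ratios below are rough illustrative values, not the parameters fitted in this work.

```python
def mayo_lewis(f1, r1, r2):
    """Mole fraction of monomer 1 in the copolymer formed instantaneously,
    given its mole fraction f1 in the unreacted monomer mixture."""
    f2 = 1.0 - f1
    num = r1 * f1**2 + f1 * f2
    den = r1 * f1**2 + 2.0 * f1 * f2 + r2 * f2**2
    return num / den

# vinyl acetate (1) / n-butyl acrylate (2), with assumed r1 << 1 << r2:
# the copolymer formed is much richer in acrylate than the feed.
print(mayo_lewis(f1=0.7, r1=0.04, r2=6.0))   # about 0.23
```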
Abstract:
This paper concerns the development of a stable model predictive controller (MPC) to be integrated with real-time optimization (RTO) in the control structure of a process system with stable and integrating outputs. The real-time process optimizer produces optimal targets for the system inputs and outputs that should be dynamically implemented by the MPC controller. This paper is based on a previous work (Comput. Chem. Eng. 2005, 29, 1089) in which a nominally stable MPC was proposed for systems with the conventional control approach, where only the outputs have set points. It is also based on the work of Gonzalez et al. (J. Process Control 2009, 19, 110), where the zone control of stable systems is studied. The new control law is obtained by defining an extended control objective that includes input targets and zone control of the outputs. Additional decision variables are also defined to increase the set of feasible solutions to the control problem. The hard constraints resulting from the cancellation of the integrating modes at the end of the control horizon are softened, and the resulting control problem is made feasible for a large class of unknown disturbances and changes of the optimizing targets. The methods are illustrated with the simulated application of the proposed approaches to a distillation column of the oil refining industry.
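A one-step, steady-state caricature of this extended objective is sketched below: inputs are steered toward the RTO targets while outputs are penalized only when they leave their zones. The gain matrix, zones and weights are assumptions; the paper's controller uses a full dynamic prediction model with stability constraints.

```python
import numpy as np
from scipy.optimize import minimize

G = np.array([[0.8, 0.2],
              [0.3, 1.1]])                 # assumed steady-state gain matrix
u_target = np.array([1.0, 0.5])            # input targets from the RTO layer
y_lo = np.array([0.0, 0.0])                # lower zone bounds for the outputs
y_hi = np.array([0.85, 1.5])               # upper zone bounds for the outputs
Qy, Qu = 10.0, 1.0                         # zone-violation and input-target weights

def cost(u):
    y = G @ u
    slack = np.maximum(y - y_hi, 0.0) + np.maximum(y_lo - y, 0.0)   # zone violation
    return Qy * slack @ slack + Qu * (u - u_target) @ (u - u_target)

u_opt = minimize(cost, x0=np.zeros(2), method='Nelder-Mead').x
print(u_opt, G @ u_opt)   # inputs pulled toward the targets, outputs kept near their zones
```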
Abstract:
Cooling towers are widely used for cooling in many industrial and utility plants, and their thermal performance is of vital importance. Despite the wide interest in cooling tower design and rating and its importance in energy conservation, there are few investigations concerning the integrated analysis of cooling systems. This work presents an approach for the systemic performance analysis of a cooling water system. The approach combines experimental design with mathematical modeling. An experimental investigation was carried out to characterize the mass transfer in the packing of the cooling tower as a function of the liquid and gas flow rates, whose results were within the range of the measurement accuracy. Then, an integrated model was developed that relies on the mass and heat transfer of the cooling tower, as well as on the hydraulic and thermal interactions with a heat exchanger network. The integrated model for the cooling water system was simulated and the temperature results agree with the experimental data from the real operation of the pilot plant. A case study illustrates the interactions in the system and the need for a systemic analysis of the cooling water system. The proposed mathematical and experimental analysis should be useful for the performance analysis of real-world cooling water systems. (C) 2009 Elsevier Ltd. All rights reserved.
Abstract:
Several MPC applications implement a control strategy in which some of the system outputs are controlled within specified ranges or zones, rather than at fixed set points [J.M. Maciejowski, Predictive Control with Constraints, Prentice Hall, New Jersey, 2002]. This means that these outputs will be treated as controlled variables only when the predicted future values lie outside the boundary of their corresponding zones. The zone control is usually implemented by selecting an appropriate weighting matrix for the output error in the control cost function. When an output prediction is inside its zone, the corresponding weight is zeroed, so that the controller ignores this output. When the output prediction lies outside the zone, the error weight is made equal to a specified value and the distance between the output prediction and the boundary of the zone is minimized. The main problem of this approach, as far as stability of the closed loop is concerned, is that each time an output is switched from the status of non-controlled to the status of controlled, or vice versa, a different linear controller is activated. Thus, throughout the continuous operation of the process, the control system keeps switching from one controller to another. Even if a stabilizing control law is developed for each of the control configurations, switching among stable controllers does not necessarily produce a stable closed-loop system. Here, a stable MPC is developed for the zone control of open-loop stable systems. Focusing on the practical application of the proposed controller, it is assumed that in the control structure of the process system there is an upper optimization layer that defines optimal targets for the system inputs. The performance of the proposed strategy is illustrated by simulation of a subsystem of an industrial FCC system. (C) 2008 Elsevier Ltd. All rights reserved.
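The conventional weight-switching rule discussed above can be sketched as follows; the function name and figures are illustrative.

```python
import numpy as np

def zone_error_and_weights(y_pred, y_lo, y_hi, w_active):
    """Error to the nearest zone boundary and the per-output weights used in the
    MPC cost at this sampling instant; the weight is zeroed inside the zone."""
    below = y_pred < y_lo
    above = y_pred > y_hi
    error = np.where(below, y_lo - y_pred, 0.0) + np.where(above, y_pred - y_hi, 0.0)
    weights = np.where(below | above, w_active, 0.0)
    return error, weights

e, w = zone_error_and_weights(np.array([0.5, 1.8]), np.array([0.0, 0.0]),
                              np.array([1.2, 1.5]), w_active=10.0)
print(e, w)   # first output ignored (inside its zone), second output penalized
```

Each call can activate a different set of weights, which is exactly the controller switching whose stability implications are addressed above.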
Abstract:
The procedure for online process control by attributes consists of inspecting a single item every m produced items. On the basis of the inspection result, it is decided whether the process is in-control (the conforming fraction is stable) or out-of-control (the conforming fraction has decreased, for example). Most articles about online process control consider stopping the production process for an adjustment when the inspected item is non-conforming (production then restarts in-control, here denominated a corrective adjustment). Moreover, the articles related to this subject do not present semi-economical designs (which may yield high quantities of non-conforming items), as they do not include a policy of preventive adjustments (in which case no item is inspected), which can be more economical, mainly if the inspected item can be misclassified. In this article, the choice between a preventive and a corrective adjustment of the process is decided at every m-th produced item. If a preventive adjustment is decided upon, then no item is inspected. Otherwise, the m-th item is inspected; if it conforms, production goes on; otherwise, an adjustment takes place and the process restarts in-control. This approach is economically feasible for some practical situations, and the parameters of the proposed procedure are determined by minimizing an average cost function subject to some statistical restrictions (for example, to assure a minimal level, fixed in advance, of conforming items in the production process). Numerical examples illustrate the proposal.
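A heavily simplified, assumed version of the economic design step is sketched below: a toy average cost per item (inspection, adjustment and non-conforming penalties under a rough renewal argument) is minimized over the sampling interval m. It stands in for, and does not reproduce, the paper's cost function and statistical restrictions.

```python
import numpy as np

c_insp, c_adj, c_nc = 0.5, 20.0, 5.0   # inspection, adjustment, non-conforming costs (assumed)
q = 0.01                               # per-item probability of shifting out of control
p_in, p_out = 0.99, 0.80               # conforming fractions in/out of control

def avg_cost(m):
    """Rough average cost per item for inspection interval m (toy renewal argument)."""
    detect = 1.0 - p_out                       # P(an inspected item reveals the shift)
    exposure = m / 2.0 + m * (1.0 / detect - 1.0)   # items produced out of control
    cycle = 1.0 / q + exposure                 # expected items per adjustment cycle
    nonconf = (1.0 / q) * (1 - p_in) + exposure * (1 - p_out)
    return c_insp / m + (c_adj + c_nc * nonconf) / cycle

m_grid = np.arange(1, 101)
best_m = m_grid[np.argmin([avg_cost(m) for m in m_grid])]
print(best_m, avg_cost(best_m))
```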
Abstract:
Among several sources of process variability, valve friction and inadequate controller tuning are considered two of the most prevalent. Friction quantification methods can be applied to the development of model-based compensators or to diagnose valves that need repair, whereas accurate process models can be used in controller retuning. This paper extends existing methods that jointly estimate the friction and process parameters, so that a nonlinear structure is adopted to represent the process model. The developed estimation algorithm is tested with three different data sources: a simulated first-order plus dead time process, a hybrid setup (composed of a real valve and a simulated pH neutralization process) and three industrial datasets corresponding to real control loops. The results demonstrate that the friction is accurately quantified and that "good" process models are estimated in several situations. Furthermore, when a nonlinear process model is considered, the proposed extension presents significant advantages: (i) greater accuracy in friction quantification and (ii) reasonable estimates of the nonlinear steady-state characteristics of the process. (C) 2010 Elsevier Ltd. All rights reserved.
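An assumed simplification of the joint estimation idea is sketched below: a one-parameter stick model for the valve combined with a first-order ARX process model, with the friction band found by grid search (the stick model is non-smooth) and the process parameters by linear least squares for each candidate band. Neither model matches the structures used in the paper.

```python
import numpy as np

def valve(op, d):
    """Valve position: it moves only when the demanded change exceeds the band d."""
    mv = np.zeros_like(op)
    for k in range(1, len(op)):
        mv[k] = op[k] if abs(op[k] - mv[k - 1]) > d else mv[k - 1]
    return mv

def fit_loop(op, pv, d_grid):
    """Grid-search d; for each d fit pv[k] ~ a*pv[k-1] + b*mv[k-1] by least squares."""
    best = None
    for d in d_grid:
        mv = valve(op, d)
        X = np.column_stack([pv[:-1], mv[:-1]])
        (a, b), res, *_ = np.linalg.lstsq(X, pv[1:], rcond=None)
        sse = float(res[0]) if res.size else float(np.sum((pv[1:] - X @ [a, b]) ** 2))
        if best is None or sse < best[0]:
            best = (sse, d, a, b)
    return best[1:]

# Synthetic loop data generated with d = 0.5, a = 0.8, b = 0.2 (for demonstration).
rng = np.random.default_rng(0)
op = np.cumsum(rng.normal(0.0, 0.3, 500))
mv_true = valve(op, 0.5)
pv = np.zeros_like(op)
for k in range(1, len(op)):
    pv[k] = 0.8 * pv[k - 1] + 0.2 * mv_true[k - 1]
pv += rng.normal(0.0, 0.01, len(pv))

print(fit_loop(op, pv, d_grid=np.linspace(0.0, 1.0, 21)))   # roughly (0.5, 0.8, 0.2)
```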
Abstract:
In this article, we consider the stochastic optimal control problem of discrete-time linear systems subject to Markov jumps and multiplicative noise under three kinds of performance criteria related to the final value of the expectation and variance of the output. In the first problem it is desired to minimise the final variance of the output subject to a restriction on its final expectation, in the second one it is desired to maximise the final expectation of the output subject to a restriction on its final variance, and in the third one the performance criterion is a linear combination of the final variance and expectation of the output of the system. We present explicit sufficient conditions for the existence of an optimal control strategy for these problems, generalising previous results in the literature. We conclude this article by presenting a numerical example of an asset-liability management model for pension funds with regime switching.
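In hedged form, with assumed notation (y_T the output at the final time T, u the control strategy, and epsilon, sigma^2, xi given constants), the three criteria read:

```latex
\begin{aligned}
\text{(i)}   &\quad \min_{u}\ \operatorname{Var}(y_T) \quad \text{s.t.}\ \mathbb{E}(y_T) \ge \epsilon,\\
\text{(ii)}  &\quad \max_{u}\ \mathbb{E}(y_T)          \quad \text{s.t.}\ \operatorname{Var}(y_T) \le \sigma^{2},\\
\text{(iii)} &\quad \min_{u}\ \operatorname{Var}(y_T) - \xi\,\mathbb{E}(y_T).
\end{aligned}
```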