905 results for Dynamic security assessment
Abstract:
The main objective of on-line dynamic security assessment is to take preventive action if required, or to decide on remedial action if a contingency actually occurs. Stability limits are obtained for different contingencies. The mode of instability is one of the outputs of dynamic security analysis. When a power system becomes unstable, it initially splits into two groups of generators, and there is a unique cutset in the transmission network, known as the critical cutset, across which the angles become unbounded. Knowledge of the critical cutset is additional information obtained from dynamic security assessment, which can be used for initiating preventive control actions, deciding on emergency control actions, and adaptive out-of-step relaying. In this article, an analytical technique is presented for fast prediction of the critical cutset from a short-duration system simulation. Case studies on the New England ten-generator system are presented. The article also suggests applications of the identification of critical cutsets.
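The grouping idea in this abstract can be sketched in a few lines. The following is an illustrative toy, not the paper's analytical technique: generators are split into two coherent groups at the largest gap in their post-fault rotor angles, and the candidate critical cutset is the set of lines joining the two groups. All angles and lines below are hypothetical.

```python
# Toy sketch: split generators into two coherent groups at the largest
# gap in sorted rotor angles, then collect the lines that connect buses
# in different groups (the candidate critical cutset).

def split_by_largest_gap(angles):
    """Split generator buses into two groups at the largest angle gap."""
    ordered = sorted(angles, key=angles.get)
    gaps = [(angles[ordered[i + 1]] - angles[ordered[i]], i)
            for i in range(len(ordered) - 1)]
    _, cut = max(gaps)
    return set(ordered[:cut + 1]), set(ordered[cut + 1:])

def critical_cutset(lines, group_a, group_b):
    """Lines whose endpoints lie in different coherent groups."""
    return [(u, v) for (u, v) in lines
            if (u in group_a) != (v in group_a)]

# Hypothetical post-fault rotor angles (radians) and network lines.
angles = {"G1": 0.10, "G2": 0.15, "G3": 1.80, "G4": 1.95}
lines = [("G1", "G2"), ("G2", "G3"), ("G3", "G4"), ("G1", "G3")]

lagging, advanced = split_by_largest_gap(angles)
print(critical_cutset(lines, lagging, advanced))  # lines crossing the split
```

With these numbers, G1 and G2 stay together while G3 and G4 pull away, so the lines G2–G3 and G1–G3 form the cutset across which the angle difference grows.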
Abstract:
An application of direct methods to dynamic security assessment of power systems using structure-preserving energy functions (SPEF) is presented. The transient energy margin (TEM) is used as an index for checking the stability of the system as well as for ranking contingencies by their severity. The computation of the TEM requires the evaluation of the critical energy and the energy at fault clearing. Usually this is done by simulating the faulted trajectory, which is time-consuming. In this paper, a new algorithm which eliminates the faulted trajectory estimation is presented to calculate the TEM. The system equations and the SPEF are developed using the centre-of-inertia (COI) formulation, and the loads are modelled as arbitrary functions of the respective bus voltages. The critical energy is evaluated using the potential energy boundary surface (PEBS) method. The method is illustrated on two realistic power system examples.
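To make the TEM idea concrete, here is a minimal single-machine-infinite-bus sketch using the classical energy function, not the paper's SPEF/COI formulation: the margin is the critical energy (at the unstable equilibrium) minus the total energy at fault clearing, and a positive margin indicates first-swing stability. All per-unit parameters are hypothetical.

```python
import math

# Classical SMIB energy function (kinetic + potential relative to the
# stable equilibrium). This is a simplified stand-in for the SPEF.

def energy(delta, omega, m, pm, pmax, delta_s):
    kinetic = 0.5 * m * omega ** 2
    potential = (-pm * (delta - delta_s)
                 - pmax * (math.cos(delta) - math.cos(delta_s)))
    return kinetic + potential

def transient_energy_margin(delta_cl, omega_cl, m, pm, pmax):
    """TEM = V_cr - V_cl; > 0 suggests the first swing is stable."""
    delta_s = math.asin(pm / pmax)   # stable equilibrium angle
    delta_u = math.pi - delta_s      # unstable equilibrium angle
    v_cr = energy(delta_u, 0.0, m, pm, pmax, delta_s)   # critical energy
    v_cl = energy(delta_cl, omega_cl, m, pm, pmax, delta_s)
    return v_cr - v_cl

# Hypothetical per-unit data: clearing angle/speed, inertia, powers.
tem = transient_energy_margin(delta_cl=0.9, omega_cl=1.2, m=0.1, pm=0.8, pmax=1.8)
print(tem > 0)
```

In the paper the critical energy comes from the PEBS method and the clearing-time energy is obtained without simulating the faulted trajectory; this sketch only shows how the two quantities combine into the margin.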
Abstract:
Power system dynamic analysis and security assessment are becoming more significant today due to increases in size and complexity from restructuring, emerging new uncertainties, integration of renewable energy sources, distributed generation, and microgrids. Precise modelling of all contributing elements/devices, understanding their interactions in detail, and observing hidden dynamics with existing analysis tools/theorems are difficult, or even impossible. In this chapter, the power system is considered as a continuum, and the propagating electromechanical waves initiated by faults and other random events are studied to provide a new scheme for stability investigation of a large-dimensional system. For this purpose, the measured electrical indices (such as rotor angle and bus voltage) following a fault at different points in the network are used, and the behaviour of the propagated waves through the lines, nodes, and buses is analysed. The impact of weak transmission links on a progressive electromechanical wave is addressed using the energy function concept. It is also emphasized that determining the severity of a disturbance/contingency accurately, without considering the related electromechanical waves, hidden dynamics, and their properties, is not secure enough. Considering these phenomena requires heavy and time-consuming calculations, which are not suitable for online stability assessment problems. However, using a continuum model for a power system reduces the burden of complex calculations.
Abstract:
We present a hierarchical model for assessing an object-oriented program's security. Security is quantified using structural properties of the program code to identify the ways in which `classified' data values may be transferred between objects. The model begins with a set of low-level security metrics based on traditional design characteristics of object-oriented classes, such as data encapsulation, cohesion and coupling. These metrics are then used to characterise higher-level properties concerning the overall readability and writability of classified data throughout the program. In turn, these metrics are then mapped to well-known security design principles such as `assigning the least privilege' and `reducing the size of the attack surface'. Finally, the entire program's security is summarised as a single security index value. These metrics allow different versions of the same program, or different programs intended to perform the same task, to be compared for their relative security at a number of different abstraction levels. The model is validated via an experiment involving five open source Java programs, using a static analysis tool we have developed to automatically extract the security metrics from compiled Java bytecode.
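The layered metric idea can be illustrated with a toy model. The sketch below is not the authors' metric suite: it marks attributes of hypothetical classes as classified and public/private, measures the fraction of classified attributes that are publicly readable, and folds that into a single index where 1.0 means no classified data is exposed.

```python
# Toy structure-based security metric: exposure of classified attributes.

def classified_exposure(classes):
    """Fraction of classified attributes that are publicly readable."""
    classified = [(name, attr) for name, attrs in classes.items()
                  for attr, props in attrs.items() if props["classified"]]
    if not classified:
        return 0.0
    exposed = sum(1 for name, attr in classified
                  if classes[name][attr]["public"])
    return exposed / len(classified)

def security_index(classes):
    """Single summary value in [0, 1]; 1.0 = no classified data exposed."""
    return 1.0 - classified_exposure(classes)

# Hypothetical program model extracted from bytecode.
program = {
    "Account": {"balance": {"classified": True, "public": False},
                "pin":     {"classified": True, "public": True}},
    "Logger":  {"buffer":  {"classified": False, "public": True}},
}
print(security_index(program))  # one of two classified attributes exposed
```

Comparing the index across two versions of the same program mirrors the comparison the abstract describes, just with a single simplistic metric instead of the full hierarchy.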
Abstract:
Refactoring is a common approach to producing better quality software. Its impact on many software quality properties, including reusability, maintainability and performance, has been studied and measured extensively. However, its impact on the information security of programs has received relatively little attention. In this work, we assess the impact of a number of the most common code-level refactoring rules on data security, using security metrics that are capable of measuring security from the viewpoint of potential information flow. The metrics are calculated for a given Java program using a static analysis tool we have developed to automatically analyse compiled Java bytecode. We ran our Java code analyser on various programs which were refactored according to each rule. New values of the metrics for the refactored programs then confirmed that the code changes had a measurable effect on information security.
Abstract:
"May 2001."
Abstract:
This paper deals with a hybrid method for transient stability analysis that combines time-domain simulation and a direct method. Nowadays, step-by-step simulation is the best available tool for using detailed models and providing reliable results. The main limitations of this approach are the long computation times and the absence of a stability margin. On the other hand, direct methods, which demand less CPU time, have not yet shown sufficient reliability and applicability. The best option seems to be hybrid solutions, in which a direct method is incorporated into a time-domain simulation tool. This work studies a direct method that uses the transient potential and kinetic energy of the critical machine only. In this paper the critical machine is identified by a fast and efficient method, and a new proposal is made for obtaining stability margins from hybrid approaches. Results from test systems, such as a 16-machine system, show stability indices for dynamic security assessment. © 2001 IEEE.
Abstract:
Indices that quantify how stable or unstable a contingency is in an electric power system have been the object of several studies over the last decades. In some approaches, indices are obtained from time-domain simulation; others compute the stability margin with the so-called direct methods, or even with neural networks. The goal is always a fast and reliable way of analysing the large disturbances that might occur in power systems. A fast classification into stable and unstable cases, with respect to transient stability, is crucial for dynamic security analysis. Any good proposal for analysing contingencies must present some important features: classification of contingencies, precision and reliability, and computational efficiency. Indices obtained from time-domain simulations have been used to classify contingencies as stable or unstable. These indices are based on the concepts of coherence, transient energy conversion between kinetic and potential energy, and three dot products of state variables. Classifying contingencies using the indices individually is not reliable, since the performance of each index varies with the simulated condition. However, collapsing these indices into a single one can improve the analysis significantly. In this paper, we present the results of an approach that filters contingencies by classifying them simply as stable, unstable or marginal. This classification is performed from composite indices obtained from step-by-step simulation over a period equal to the fault clearing time plus 0.5 seconds. Contingencies originally classified as stable or unstable do not require this extra simulation. The methodology requires an initial effort to obtain the classification intervals and the weights. This is done once for each power system and can then be used for different operating conditions and different contingencies.
No misclassification occurred in any of the tests, i.e., no stable case was classified as unstable or vice versa. The methodology is thus well suited, since it allows a rapid conclusion about the stability of the system for the majority of the contingencies (stable or unstable cases). The tests, results and discussions are presented using two power systems: (1) the IEEE17 system, composed of 17 generators, 162 buses and 284 transmission lines; and (2) a South Brazilian system configuration, with 10 generators, 45 buses and 71 lines.
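The filtering scheme in this abstract can be sketched as follows. This is a hedged illustration, not the paper's tuned scheme: several per-index scores from a short time-domain simulation are collapsed into one weighted value, which is mapped onto stable / marginal / unstable bands; only marginal cases would need further simulation. The index names, weights, and thresholds are hypothetical (in the paper they are calibrated once per power system).

```python
# Composite-index filter: weighted combination of individual indices,
# then a three-way classification with hypothetical thresholds.

def composite_index(indices, weights):
    """Collapse several stability indices into a single weighted value."""
    return sum(weights[k] * indices[k] for k in indices)

def classify(value, stable_max=0.4, unstable_min=0.7):
    if value <= stable_max:
        return "stable"
    if value >= unstable_min:
        return "unstable"
    return "marginal"   # only these cases need extended simulation

# Hypothetical indices: coherence, energy conversion, dot products.
indices = {"coherence": 0.2, "energy": 0.3, "dot_product": 0.25}
weights = {"coherence": 0.4, "energy": 0.4, "dot_product": 0.2}
print(classify(composite_index(indices, weights)))
```

The point of the composite value is exactly what the abstract argues: no single index is reliable on its own, but the weighted combination separates clear-cut cases from the few that deserve a longer look.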
Abstract:
This work compares the C4.5 and MLP (Multilayer Perceptron) algorithms applied to dynamic security assessment (DSA) and to preventive control design, focusing on the transient stability of electric power systems (EPS). C4.5 is a decision tree (DT) algorithm, and the MLP is a member of the artificial neural network (ANN) family. Both algorithms provide solutions to the real-time DSA problem by quickly identifying when a power system is subject to a critical disturbance (a short circuit, for example) that may lead to transient instability. Moreover, the knowledge obtained from both techniques, in the form of rules, can be used in preventive control design to restore the security of the power system against critical disturbances. Based on a database built from exhaustive time-domain simulations, some specific critical disturbances are taken as examples to compare the C4.5 and MLP algorithms applied to DSA and to the support of preventive actions. The comparative study is tested on the New England power system. In the case studies, the database is generated with the PSTv3 (Power System Toolbox) program. The DTs and ANNs are trained and tested using the Rapidminer program. The results show that the C4.5 and MLP algorithms are promising for DSA applications and preventive control design.
Abstract:
The techniques used for static security assessment of electric power systems depend on running a large number of load-flow cases for various system topologies and operating conditions. In real-time operating environments this practice is hard to carry out, especially in large systems, where running all the required load-flow cases demands considerable time and computational effort even with currently available resources. Data-mining techniques such as decision trees have been used in recent years and have achieved good results in static and dynamic security assessment of electric power systems. This work presents a methodology for real-time static security assessment of electric power systems using decision trees: from off-line load-flow simulations run with the Anarede software (CEPEL), an extensive labelled database of system states was generated for various operating conditions. This database was used to induce the decision trees, yielding a fast and accurate prediction model that classifies the system state (secure or insecure) for real-time application. The methodology reduces the online computational load, since evaluating a decision tree requires only a few if-then logical tests: a small number of numerical tests at the binary nodes to check the attribute values against the rules, the number of tests being equal to the number of hierarchical levels of the decision tree, which is usually small. With this simple processing, the static security assessment task can be performed in a fraction of the time required by the fastest traditional methods.
To validate the methodology, a case study based on a real power system was carried out, in which, for each contingency classified as insecure, a corrective control action is executed based on the decision-tree information about the critical attribute that most affects security. The results showed the methodology to be an important tool for real-time static security assessment for use in a system operation centre.
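The online cheapness argued above comes from the shape of the induced tree: classification is just a few if-then tests, one per hierarchical level. The sketch below illustrates this with a tiny hand-coded tree; the attributes and thresholds are invented stand-ins for rules that would be learned offline from labelled load-flow cases.

```python
# Toy stand-in for an induced decision tree: classifying a system
# snapshot costs one numeric comparison per tree level.

def classify_state(snapshot):
    """Walk a tiny hard-coded two-level tree over hypothetical attributes."""
    if snapshot["min_bus_voltage_pu"] < 0.92:
        return "insecure"          # undervoltage somewhere in the grid
    if snapshot["max_line_loading_pct"] >= 95.0:
        return "insecure"          # a branch close to its thermal limit
    return "secure"

# Hypothetical real-time snapshots from the operation centre.
print(classify_state({"min_bus_voltage_pu": 0.96, "max_line_loading_pct": 80.0}))
print(classify_state({"min_bus_voltage_pu": 0.90, "max_line_loading_pct": 60.0}))
```

A real deployment would induce the tree (attributes, thresholds, depth) from the Anarede-generated database rather than hard-code it, but the online cost stays the same: a handful of comparisons per snapshot.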
Abstract:
Effective enterprise information security policy management requires review and assessment activities to ensure information security policies are aligned with business goals and objectives. As security policy management involves the elements of policy development process and the security policy as output, the context for security policy assessment requires goal-based metrics for these two elements. However, the current security management assessment methods only provide checklist types of assessment that are predefined by industry best practices and do not allow for developing specific goal-based metrics. Utilizing theories drawn from literature, this paper proposes the Enterprise Information Security Policy Assessment approach that expands on the Goal-Question-Metric (GQM) approach. The proposed assessment approach is then applied in a case scenario example to illustrate a practical application. It is shown that the proposed framework addresses the requirement for developing assessment metrics and allows for the concurrent undertaking of process-based and product-based assessment. Recommendations for further research activities include the conduct of empirical research to validate the propositions and the practical application of the proposed assessment approach in case studies to provide opportunities to introduce further enhancements to the approach.
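The GQM structure the proposed assessment approach expands on can be shown as a small data model. The goal, questions, and metrics below are invented for illustration: each goal is refined into questions, and each question is answered by one or more measurable metrics.

```python
# Minimal Goal-Question-Metric hierarchy (illustrative content only).

gqm = {
    "goal": "Ensure the password policy supports business continuity",
    "questions": {
        "Is the policy reviewed regularly?": ["months_since_last_review"],
        "Do users comply with the policy?": ["pct_accounts_compliant"],
    },
}

def metrics_for(model):
    """Flatten the hierarchy into the full set of metrics to collect."""
    return sorted(m for ms in model["questions"].values() for m in ms)

print(metrics_for(gqm))
```

The approach in the abstract goes further by deriving such goal-based metrics for both the policy development process and the policy document itself, rather than relying on a predefined best-practice checklist.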
Abstract:
To ensure the safe operation of Web-based systems in Web environments, we propose an SSPA (Server-based SHA-1 Page-digest Algorithm) to verify the integrity of Web contents before the server issues an HTTP response to a user request. In addition to standard security measures, our Java implementation of the SSPA, called the Dynamic Security Surveillance Agent (DSSA), provides further security in terms of content integrity for Web-based systems. Its function is to prevent the display of Web contents that have been altered through the malicious acts of attackers and intruders on client machines. This protects the reputation of organisations from cyber-attacks and ensures the safe operation of Web systems by dynamically monitoring the integrity of a Web site's content on demand. We discuss our findings in terms of the applicability and practicality of the proposed system. We also discuss its time metrics, specifically its computational overhead at the Web server, as well as the overall latency from the clients' point of view, using different Internet access methods. The SSPA, our DSSA implementation, some experimental results and related work are all discussed.
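The core digest check behind the SSPA can be sketched in a few lines (in Python here for brevity; the paper's DSSA is a Java implementation). The page content and the way the trusted digest is stored are hypothetical; the abstract names SHA-1, though a modern deployment would prefer SHA-256.

```python
import hashlib

# Page-digest integrity check: recompute the SHA-1 digest of the page
# before serving and refuse to respond if it no longer matches the
# digest recorded when the page was published.

def page_digest(content: bytes) -> str:
    return hashlib.sha1(content).hexdigest()

def verify_before_response(content: bytes, trusted_digest: str) -> bool:
    """True if the page is unmodified and safe to serve."""
    return page_digest(content) == trusted_digest

page = b"<html><body>Welcome</body></html>"
trusted = page_digest(page)      # recorded at publication time

print(verify_before_response(page, trusted))                   # unmodified
print(verify_before_response(page + b"<!--evil-->", trusted))  # tampered
```

The computational overhead the abstract measures is essentially the cost of this recomputation on every HTTP response, which is why the paper reports server-side timing as well as client-perceived latency.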