557 results for Dynamic security assessment

at Queensland University of Technology - ePrints Archive


Relevance:

100.00%

Publisher:

Abstract:

Power system dynamic analysis and security assessment are becoming more significant today due to increases in size and complexity arising from restructuring, emerging new uncertainties, integration of renewable energy sources, distributed generation, and microgrids. Precise modelling of all contributing elements/devices, understanding their interactions in detail, and observing hidden dynamics using existing analysis tools/theorems are difficult, and sometimes impossible. In this chapter, the power system is considered as a continuum, and the electromechanical waves initiated by faults and other random events are studied to provide a new scheme for stability investigation of a large-dimensional system. For this purpose, the measured electrical indices (such as rotor angle and bus voltage) following a fault at different points across the network are used, and the behaviour of the propagated waves through the lines, nodes, and buses is analyzed. The impact of weak transmission links on a progressive electromechanical wave is addressed using the energy function concept. It is also emphasized that determining the severity of a disturbance/contingency accurately without considering the related electromechanical waves, hidden dynamics, and their properties is not sufficiently reliable. Accounting for these phenomena requires heavy, time-consuming calculation, which is not suitable for online stability assessment. However, using a continuum model for the power system reduces the burden of these complex calculations.
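As a rough illustration of the continuum idea (a simplified generic form assumed here, not the chapter's exact model), the swing dynamics of a dense network can be written as a damped wave equation in which the rotor angle becomes a field delta(x, t) over the network:

% Simplified continuum approximation (illustrative assumption only)
\[
  \frac{\partial^{2}\delta}{\partial t^{2}}
  + d\,\frac{\partial \delta}{\partial t}
  = v^{2}\,\nabla^{2}\delta + p(x,t)
\]
% v: local electromechanical wave speed (set by line reactance and inertia density)
% d: damping per unit inertia;  p: net accelerating power injected by faults/disturbances

A fault then appears as a localised forcing term p(x, t) whose wavefront propagates outward at speed v, which is what the measured rotor angles and bus voltages are used to track.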

Relevance:

100.00%

Publisher:

Abstract:

We present a hierarchical model for assessing an object-oriented program's security. Security is quantified using structural properties of the program code to identify the ways in which `classified' data values may be transferred between objects. The model begins with a set of low-level security metrics based on traditional design characteristics of object-oriented classes, such as data encapsulation, cohesion and coupling. These metrics are then used to characterise higher-level properties concerning the overall readability and writability of classified data throughout the program. In turn, these metrics are then mapped to well-known security design principles such as `assigning the least privilege' and `reducing the size of the attack surface'. Finally, the entire program's security is summarised as a single security index value. These metrics allow different versions of the same program, or different programs intended to perform the same task, to be compared for their relative security at a number of different abstraction levels. The model is validated via an experiment involving five open source Java programs, using a static analysis tool we have developed to automatically extract the security metrics from compiled Java bytecode.
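As a rough, hypothetical sketch of the lowest level of such a hierarchy (the class and the 'attribute exposure' ratio below are illustrative assumptions, not the paper's metric definitions), one can measure how many of a class's fields holding classified data are directly accessible from outside the class:

import java.lang.reflect.Field;
import java.lang.reflect.Modifier;

// Illustrative only: a naive "attribute exposure" ratio, assumed here as a
// stand-in for the paper's low-level accessibility metrics.
public class AttributeExposure {

    /** Fraction of declared fields that are non-private (directly readable/writable). */
    public static double exposure(Class<?> cls) {
        Field[] fields = cls.getDeclaredFields();
        if (fields.length == 0) {
            return 0.0;
        }
        int exposed = 0;
        for (Field f : fields) {
            if (!Modifier.isPrivate(f.getModifiers())) {
                exposed++;
            }
        }
        return (double) exposed / fields.length;
    }

    // Hypothetical class holding 'classified' data: one private field, one public field.
    static class PatientRecord {
        private String name;
        public String diagnosis;   // classified value exposed to every other class
    }

    public static void main(String[] args) {
        System.out.printf("PatientRecord exposure = %.2f%n", exposure(PatientRecord.class));
        // Prints 0.50: half of the declared fields are accessible outside the class.
    }
}

The paper's metrics are extracted from compiled bytecode and then aggregated into program-level readability/writability measures and a single index; this snippet only illustrates the kind of structural property being counted.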

Relevance:

100.00%

Publisher:

Abstract:

Refactoring is a common approach to producing better quality software. Its impact on many software quality properties, including reusability, maintainability and performance, has been studied and measured extensively. However, its impact on the information security of programs has received relatively little attention. In this work, we assess the impact of a number of the most common code-level refactoring rules on data security, using security metrics that are capable of measuring security from the viewpoint of potential information flow. The metrics are calculated for a given Java program using a static analysis tool we have developed to automatically analyse compiled Java bytecode. We ran our Java code analyser on various programs which were refactored according to each rule. New values of the metrics for the refactored programs then confirmed that the code changes had a measurable effect on information security.
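For a concrete, hypothetical example of the kind of code change being scored, the common Encapsulate Field refactoring replaces direct access to a classified value with accessor methods, altering how that value can flow out of the class:

// Before refactoring: the classified balance is a public field, so any class
// can read or overwrite it directly.
class AccountBefore {
    public double balance;
}

// After the "Encapsulate Field" refactoring: the field is private and only
// reachable through accessors, which an information-flow metric can treat as
// a narrower (and auditable) interface to the classified value.
class AccountAfter {
    private double balance;

    public double getBalance() {
        return balance;
    }

    public void setBalance(double balance) {
        this.balance = balance;
    }
}

Whether a given rule improves or degrades a particular information-flow metric is exactly what the measurements reveal; this sketch only shows the structural change, not the paper's scoring.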

Relevance:

100.00%

Publisher:

Abstract:

Effective enterprise information security policy management requires review and assessment activities to ensure information security policies are aligned with business goals and objectives. Because security policy management involves both the policy development process and the security policy produced as its output, the context for security policy assessment requires goal-based metrics for these two elements. However, current security management assessment methods provide only checklist-style assessments predefined by industry best practices and do not allow specific goal-based metrics to be developed. Drawing on theories from the literature, this paper proposes the Enterprise Information Security Policy Assessment approach, which expands on the Goal-Question-Metric (GQM) approach. The proposed assessment approach is then applied in a case scenario to illustrate its practical application. It is shown that the proposed framework addresses the requirement for developing assessment metrics and allows process-based and product-based assessment to be undertaken concurrently. Recommendations for further research include empirical studies to validate the propositions and practical application of the assessment approach in case studies to identify further enhancements.
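A minimal sketch of how a GQM hierarchy for policy assessment might be represented, assuming invented goal, question, and metric texts purely for illustration:

import java.util.List;

// Minimal GQM containers; the example content is illustrative only.
record Metric(String name, String unit) {}
record Question(String text, List<Metric> metrics) {}
record Goal(String statement, List<Question> questions) {}

public class GqmExample {
    public static void main(String[] args) {
        Goal goal = new Goal(
            "Ensure the password policy supports the business goal of reducing account compromise",
            List.of(new Question(
                "How well is the policy followed in practice?",
                List.of(new Metric("Accounts compliant with password rules", "%"),
                        new Metric("Policy exceptions granted per quarter", "count")))));

        System.out.println("Goal: " + goal.statement());
        for (Question q : goal.questions()) {
            System.out.println("  Q: " + q.text());
            q.metrics().forEach(m ->
                System.out.println("    M: " + m.name() + " [" + m.unit() + "]"));
        }
    }
}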

Relevance:

100.00%

Publisher:

Abstract:

To ensure the safe operation of Web-based systems in Web environments, we propose an SSPA (Server-based SHA-1 Page-digest Algorithm) to verify the integrity of Web contents before the server issues an HTTP response to a user request. In addition to standard security measures, our Java implementation of the SSPA, called the Dynamic Security Surveillance Agent (DSSA), provides further security in terms of content integrity for Web-based systems. Its function is to prevent the display, on client machines, of Web contents that have been altered through the malicious acts of attackers and intruders. This is to protect the reputation of organisations from cyber-attacks and to ensure the safe operation of Web systems by dynamically monitoring the integrity of a Web site's content on demand. We discuss our findings in terms of the applicability and practicality of the proposed system. We also discuss its time metrics, specifically its computational overhead at the Web server and the overall latency from the clients' point of view using different Internet access methods. The SSPA, our DSSA implementation, some experimental results, and related work are all discussed.
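A minimal sketch of the page-digest idea, assuming the server keeps a trusted reference digest per page; the file paths and comparison policy below are assumptions for illustration, not the SSPA/DSSA design:

import java.nio.file.Files;
import java.nio.file.Path;
import java.security.MessageDigest;
import java.util.HexFormat;

// Illustrative page-integrity check: recompute the SHA-1 digest of a page and
// compare it with a previously stored reference before serving the page.
public class PageDigestCheck {

    static String sha1Hex(byte[] content) throws Exception {
        MessageDigest md = MessageDigest.getInstance("SHA-1");
        return HexFormat.of().formatHex(md.digest(content));
    }

    public static void main(String[] args) throws Exception {
        Path page = Path.of("webroot/index.html");       // hypothetical page
        Path reference = Path.of("digests/index.sha1");  // hypothetical stored digest

        String current = sha1Hex(Files.readAllBytes(page));
        String expected = Files.readString(reference).trim();

        if (current.equalsIgnoreCase(expected)) {
            System.out.println("Integrity OK - page may be served.");
        } else {
            System.out.println("Digest mismatch - withhold page and alert the administrator.");
        }
    }
}

Note that SHA-1 is no longer considered collision-resistant, so a present-day deployment of the same scheme would typically substitute SHA-256.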

Relevance:

80.00%

Publisher:

Abstract:

Load modelling plays an important role in power system dynamic stability assessment. One of the widely used methods of assessing the impact of load models on system dynamic response is parametric sensitivity analysis. A load sensitivity analysis framework based on a composite load model is proposed. It enables comprehensive investigation of load modelling impacts on system stability, considering the dynamic interactions between load and system dynamics. The effect of the location of individual composite loads, as well as of patches of composite loads in the vicinity, on the sensitivity of the oscillatory modes is investigated. The impact of load composition on the overall sensitivity of the load is also investigated.
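For reference, the parametric sensitivity of an oscillatory mode is commonly quantified with the standard eigenvalue-sensitivity relation (generic form shown; the paper's composite-load parameterisation is not reproduced here):

\[
  \frac{\partial \lambda_i}{\partial p}
  = \frac{\psi_i \left(\dfrac{\partial A}{\partial p}\right) \phi_i}{\psi_i \, \phi_i}
\]

where A is the linearised system state matrix, p a load-model parameter, and \phi_i, \psi_i the right and left eigenvectors associated with mode \lambda_i.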

Relevance:

80.00%

Publisher:

Abstract:

Load modeling plays an important role in power system dynamic stability assessment. One of the widely used methods of assessing the impact of load models on system dynamic response is parametric sensitivity analysis. Load ranking provides an effective measure of such impact. Traditionally, load ranking has been based on either a static or a dynamic load model alone. In this paper, a load ranking framework based on a composite load model is proposed. It enables comprehensive investigation of load modeling impacts on system stability, considering the dynamic interactions between load and system dynamics. The impact of load composition on the overall sensitivity, and therefore on the ranking of the load, is also investigated. Dynamic simulations are performed to further elucidate the results obtained through the sensitivity-based load ranking approach.
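Once per-load sensitivities are available, the ranking step itself is simple; a hypothetical sketch (bus names and sensitivity values invented) orders loads by the magnitude of their modal sensitivity:

import java.util.ArrayList;
import java.util.List;
import java.util.Map;

// Illustrative load ranking: order loads by |sensitivity| of a chosen
// oscillatory mode. The sensitivity values are invented for illustration.
public class LoadRanking {
    public static void main(String[] args) {
        Map<String, Double> sensitivity = Map.of(
            "Load at bus 4", -0.082,
            "Load at bus 7",  0.015,
            "Load at bus 9",  0.047);

        List<Map.Entry<String, Double>> ranked = new ArrayList<>(sensitivity.entrySet());
        ranked.sort((a, b) -> Double.compare(Math.abs(b.getValue()), Math.abs(a.getValue())));

        int rank = 1;
        for (Map.Entry<String, Double> e : ranked) {
            System.out.printf("%d. %s  (sensitivity %+.3f)%n", rank++, e.getKey(), e.getValue());
        }
    }
}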

Relevance:

40.00%

Publisher:

Abstract:

A new method for noninvasive assessment of tear film surface quality (TFSQ) is proposed. The method is based on high-speed videokeratoscopy, in which the corneal area used for the analysis is dynamically estimated in a manner that removes the videokeratoscopy interference caused by the shadows of eyelashes but not that related to the poor quality of the precorneal tear film, which is of interest. The separation between the two types of seemingly similar videokeratoscopy interference is achieved by region-based classification, in which the overall noise is first separated from the useful signal (the unaltered videokeratoscopy pattern), followed by a dedicated interference classification algorithm that distinguishes between the two considered interferences. The proposed technique provides a much wider corneal area for the analysis of TFSQ than previously reported techniques. A preliminary study with the proposed technique, carried out for a range of anterior eye conditions, showed effective behavior in terms of noise-to-signal separation and interference classification, as well as consistent TFSQ results. Subsequently, the method proved able not only to discriminate between the bare-eye and lens-on-eye conditions but also to have the potential to discriminate between the two types of contact lenses.

Relevance:

40.00%

Publisher:

Abstract:

The international focus on embracing daylighting for energy-efficient lighting and the corporate sector’s indulgence in the perception of workplace and work-practice “transparency” have spurred an increase in highly glazed commercial buildings. This in turn has renewed issues of visual comfort and daylight-derived glare for occupants. In order to ascertain evidence of, or predict the risk of, these events, appraisals of these complex visual environments require detailed information on the luminances present in an occupant’s field of view. Conventional luminance meters are an expensive and time-consuming way of achieving these results: to create a luminance map of an occupant’s visual field using such a meter requires too many individual measurements to be a practical measurement technique. The application of digital cameras as luminance measurement devices has solved this problem. With high dynamic range imaging, a single digital image can be created to provide luminances on a pixel-by-pixel level within the broad field of view afforded by a fish-eye lens: virtually replicating an occupant’s visual field and providing rapid yet detailed luminance information for the entire scene. With proper calibration, relatively inexpensive digital cameras can be successfully applied to the task of luminance measurement, placing them in the realm of tools that any lighting professional should own. This paper discusses how a digital camera can become a luminance measurement device and then presents an analysis of results obtained from post-occupancy measurements from building assessments conducted by the Mobile Architecture Built Environment Laboratory (MABEL) project. This discussion leads to the important realisation that placing such tools in the hands of lighting professionals internationally will provide new opportunities for the lighting community in terms of research on critical lighting issues such as daylight glare and visual quality and comfort.
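The pixel-to-luminance step can be sketched as follows; the Rec. 709 relative-luminance weights are standard, but the calibration constant below is a placeholder that would have to be determined for the specific camera and lens, not a value from the MABEL work:

// Illustrative conversion of a calibrated, linear RGB pixel from an HDR image
// into luminance (cd/m^2). CALIBRATION must be established for the specific
// camera/lens against a reference luminance meter; the value below is a placeholder.
public class PixelLuminance {

    static final double CALIBRATION = 179.0; // placeholder, camera-specific scale factor

    /** Relative luminance from linear RGB using Rec. 709 weights, scaled to cd/m^2. */
    static double luminance(double r, double g, double b) {
        return CALIBRATION * (0.2126 * r + 0.7152 * g + 0.0722 * b);
    }

    public static void main(String[] args) {
        // Hypothetical linear pixel values recovered from the HDR assembly step.
        System.out.printf("L = %.1f cd/m^2%n", luminance(0.42, 0.40, 0.35));
    }
}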

Relevance:

40.00%

Publisher:

Abstract:

There are several noninvasive techniques for assessing the kinetics of the tear film, but no comparative studies have been conducted to evaluate their efficacies. Our aim is to test and compare techniques based on high-speed videokeratoscopy (HSV), dynamic wavefront sensing (DWS), and lateral shearing interferometry (LSI). Algorithms are developed to estimate the tear film build-up time, T_BLD, and the average tear film surface quality in the stable phase of the interblink interval, TFSQ_Av. Moderate but significant correlations are found between T_BLD measured with LSI and DWS based on vertical coma (Pearson's r^2 = 0.34, p < 0.01) and higher-order rms (r^2 = 0.31, p < 0.01), as well as between TFSQ_Av measured with LSI and HSV (r^2 = 0.35, p < 0.01), and between LSI and DWS based on the rms fit error (r^2 = 0.40, p < 0.01). No significant correlation is found between HSV and DWS. All three techniques estimate the tear film build-up time to be below 2.5 sec, and they achieve a remarkably close median value of 0.7 sec. HSV appears to be the most precise method for measuring tear film surface quality. LSI appears to be the most sensitive method for analyzing tear film build-up.