107 results for system analysis
in University of Queensland eSpace - Australia
Abstract:
Power system real-time security assessment is one of the fundamental modules of electricity markets. Typically, when a contingency occurs, the security assessment and enhancement module must be ready for action within about 20 minutes to meet the real-time requirement. The recent California blackout again highlighted the importance of system security. This paper proposes an approach for power system security assessment and enhancement based on information provided from a pre-defined system parameter space. The proposed scheme opens up an efficient way for real-time security assessment and enhancement in a competitive electricity market for the single-contingency case.
Abstract:
There are many techniques for electricity market price forecasting. However, most of them are designed for expected price analysis rather than price spike forecasting, and an effective method of predicting the occurrence of spikes has not yet been reported in the literature. In this paper, a data mining based approach is presented to give a reliable forecast of the occurrence of price spikes. Combined with the spike value prediction techniques developed by the same authors, the proposed approach aims at providing a comprehensive tool for price spike forecasting. Feature selection techniques are first described to identify the attributes relevant to the occurrence of spikes, and a brief introduction to the classification techniques is given for completeness. Two algorithms, a support vector machine and a probability classifier, are chosen as the spike occurrence predictors and are discussed in detail. Realistic market data are used to test the proposed model, with promising results.
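As a rough illustration of the classification stage described above, the sketch below trains a support vector machine to flag spike occurrences from market-style features. The feature set, labelling rule, and class-weighting choice are assumptions made for the example, not the authors' actual model or data.

```python
# Minimal sketch of a spike-occurrence classifier (illustrative assumptions only).
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)

# Synthetic stand-in for market data: e.g. demand level, reserve margin, prior price.
X = rng.normal(size=(2000, 3))
# Label a "spike" when demand is high and reserve is low (illustrative rule only).
y = ((X[:, 0] > 1.0) & (X[:, 1] < -0.5)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, stratify=y, random_state=0)

# SVM with class weighting, since spikes are rare events.
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf", class_weight="balanced"))
clf.fit(X_train, y_train)

print(classification_report(y_test, clf.predict(X_test)))
```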
Abstract:
The ‘leading coordinate’ approach to computing an approximate reaction pathway, with subsequent determination of the true minimum energy profile, is applied to a two-proton chain transfer model based on the chromophore and its surrounding moieties within the green fluorescent protein (GFP). Using an ab initio quantum chemical method, a number of different relaxed energy profiles are found for several plausible guesses at leading coordinates. The results obtained for different trial leading coordinates are rationalized through the calculation of a two-dimensional relaxed potential energy surface (PES) for the system. Analysis of the 2-D relaxed PES reveals that two of the trial pathways are entirely spurious, while two others contain useful information and can be used to furnish starting points for successful saddle-point searches. Implications for selection of trial leading coordinates in this class of proton chain transfer reactions are discussed, and a simple diagnostic function is proposed for revealing whether or not a relaxed pathway based on a trial leading coordinate is likely to furnish useful information.
Abstract:
Grid computing is an advanced technique for collaboratively solving complicated scientific problems using geographically and organisationally dispersed computational, data storage, and other resources. Application of grid computing could provide significant benefits to all aspects of power systems that involve computation. Based on our previous research, this paper presents a novel grid computing approach for probabilistic small signal stability (PSSS) analysis in electric power systems with uncertainties. A prototype computing grid has been successfully implemented in our research lab to carry out PSSS analysis on two benchmark systems. Compared with traditional computing techniques, grid computing gives better performance for PSSS analysis in terms of computing capacity, speed, accuracy, and stability. In addition, a computing grid framework for power system analysis is proposed based on this study.
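The grid implementation itself is not reproduced here, but the sketch below illustrates the underlying pattern of a probabilistic small-signal stability study: sample uncertain parameters, check eigenvalue damping for each sample, and distribute the samples across workers (here local processes stand in for grid nodes). The toy state matrix, parameter spread, and damping margin are invented for illustration.

```python
# Illustrative Monte Carlo PSSS loop, parallelised across local workers.
import numpy as np
from concurrent.futures import ProcessPoolExecutor

def damping_ok(seed: int) -> bool:
    """Sample an uncertain operating point and check the critical mode's damping."""
    rng = np.random.default_rng(seed)
    d = rng.normal(0.2, 0.1)                 # uncertain damping-related parameter
    A = np.array([[0.0, 1.0],
                  [-1.0, -d]])               # toy linearised state matrix dx/dt = A x
    eig = np.linalg.eigvals(A)
    # Adequately damped if all eigenvalues lie left of a small margin.
    return bool(np.all(eig.real < -0.05))

if __name__ == "__main__":
    with ProcessPoolExecutor() as pool:
        results = list(pool.map(damping_ok, range(10_000)))
    print("Probability of adequate damping:", sum(results) / len(results))
```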
Abstract:
The restructuring of power industries has brought fundamental changes to both power system operation and planning. This paper presents a new planning method that uses a multi-objective optimization (MOOP) technique, as well as human knowledge, to expand the transmission network in open access schemes. The method starts with a candidate pool of feasible expansion plans. Subsequent selection of the best candidates is carried out through a MOOP approach, in which multiple objectives are tackled simultaneously, aiming at integrating market operation and planning as one unified process in the context of a deregulated system. Human knowledge is applied in both stages to ensure that the selection reflects practical engineering and management concerns. The expansion plan from MOOP is assessed against reliability criteria before it is finalized. The proposed method has been tested on the IEEE 14-bus system, and relevant analyses and discussions are presented.
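A hedged sketch of the selection stage only: a non-dominated (Pareto) filter over a candidate pool with two illustrative objectives (investment cost and a congestion index). The plan names and objective values are invented; the paper's actual objectives, human-knowledge screening, and reliability assessment are not reproduced.

```python
# Pareto (non-dominated) selection over candidate expansion plans.
from typing import List, Tuple

Plan = Tuple[str, Tuple[float, float]]  # (name, (cost in M$, congestion index))

def dominates(a: Tuple[float, float], b: Tuple[float, float]) -> bool:
    """True if objective vector a is no worse than b everywhere and better somewhere."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(plans: List[Plan]) -> List[Plan]:
    return [p for p in plans
            if not any(dominates(q[1], p[1]) for q in plans if q is not p)]

candidates = [
    ("plan-A", (120.0, 0.30)),   # illustrative numbers only
    ("plan-B", (150.0, 0.10)),
    ("plan-C", (160.0, 0.25)),   # dominated by plan-B
    ("plan-D", (100.0, 0.45)),
]
print([name for name, _ in pareto_front(candidates)])
```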
Abstract:
Bound and resonance states of HO2 have been calculated quantum mechanically by the Lanczos homogeneous filter diagonalization method [Zhang and Smith, Phys. Chem. Chem. Phys. 3, 2282 (2001); J. Chem. Phys. 115, 5751 (2001)] for nonzero total angular momentum J = 1, 2, 3. For the lower bound states, agreement between the results in this paper and previous work is quite satisfactory, while for the high-lying bound states and resonances these are the first reported results. A helicity quantum number V assignment (within the helicity conserving approximation) is performed, and the results indicate that for the lower bound states it is possible to assign the V quantum numbers unambiguously, but for resonances it is impossible to assign the V helicity quantum numbers due to strong mixing. In fact, for the high-lying bound states the mixing has already appeared. These results indicate that the helicity conserving approximation is not good for the resonance state calculations, and exact quantum calculations are needed to accurately describe the reaction dynamics of the HO2 system. Analysis of the resonance widths shows that most of the resonances are overlapping and that the interferences between them lead to large fluctuations from one resonance to another. In accord with the conclusions from earlier J = 0 calculations, this indicates that the dissociation of HO2 is essentially irregular. (C) 2003 American Institute of Physics.
Abstract:
The BR algorithm is a novel and efficient method for finding all eigenvalues of upper Hessenberg matrices, but it has never been applied to eigenanalysis for power system small signal stability. This paper analyzes the differences between the BR and QR algorithms, with a performance comparison in terms of CPU time (based on stopping criteria) and storage requirements. The BR algorithm utilizes accelerating strategies to improve its performance when computing eigenvalues of narrowly banded, nearly tridiagonal upper Hessenberg matrices. These strategies significantly reduce the computation time at a reasonable level of precision. Compared with the QR algorithm, the BR algorithm requires fewer iteration steps and less storage space without sacrificing precision in solving eigenvalue problems of large-scale power systems. Numerical examples demonstrate the efficiency of the BR algorithm in the eigenanalysis of 39-, 68-, 115-, 300-, and 600-bus systems. Experimental results suggest that the BR algorithm is a more efficient algorithm for large-scale power system small signal stability eigenanalysis.
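The BR algorithm itself is not available in standard numerical libraries, so the sketch below only illustrates the baseline it is compared against: reducing a state matrix to upper Hessenberg form and obtaining all eigenvalues with a QR-based solver, then screening for unstable modes. The random matrix is a stand-in for a linearised power system model.

```python
# QR-based eigenanalysis baseline on an upper Hessenberg matrix (illustrative only).
import numpy as np
from scipy.linalg import hessenberg

rng = np.random.default_rng(1)
A = rng.normal(size=(300, 300))          # stand-in for a linearised state matrix

H = hessenberg(A)                        # upper Hessenberg form, input to QR/BR iterations
eigs = np.linalg.eigvals(H)              # LAPACK's QR-based eigensolver

# Small-signal stability screening: flag any mode with a non-negative real part.
unstable = eigs[eigs.real >= 0]
print(f"{len(unstable)} eigenvalues with non-negative real part")
```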
Abstract:
Training-needs analysis is critical for defining and procuring effective training systems. However, traditional approaches to training-needs analysis are not suitable for capturing the demands of highly automated and computerized work domains. In this article, we propose that work domain analysis can identify the functional structure of a work domain that must be captured in a training system, so that workers can be trained to deal with unpredictable contingencies that cannot be handled by computer systems. To illustrate this argument, we outline a work domain analysis of a fighter aircraft that defines its functional structure in terms of its training objectives, measures of performance, basic training functions, physical functionality, and physical context. The functional structure or training needs identified by work domain analysis can then be used as a basis for developing functional specifications for training systems, specifically their design objectives, data collection capabilities, scenario generation capabilities, physical functionality, and physical attributes. Finally, work domain analysis also provides a useful framework for evaluating whether a tendered solution fulfills the training needs of a work domain.
Abstract:
Despite apparently overwhelming benefits, the implementation of the Household Responsibility System (HRS) in China contained a number of flaws. The Two-Farmland System (TFS), which originated in Pingdu City in Shandong Province, sought to address the twin problems of land fragmentation and economies of size. A stochastic frontier production function analysis that isolates the impacts of land allocation reforms suggests that the TFS increased efficiency by around 7%. This article highlights the need for empirical analysis to assess objectively the merits, or otherwise, of particular reforms. (C) 2002 Elsevier Science Inc. All rights reserved.
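As a simplified stand-in for the frontier analysis named above, the sketch below fits a Cobb-Douglas production frontier to synthetic farm data by corrected OLS and derives technical efficiency scores. The paper's actual stochastic frontier uses a composed error term and real survey data; all inputs, coefficients, and the COLS shortcut here are assumptions for illustration.

```python
# Corrected-OLS production frontier on synthetic data (illustrative only).
import numpy as np

rng = np.random.default_rng(42)
n = 500
land = rng.lognormal(mean=0.5, sigma=0.4, size=n)     # input: land area
labour = rng.lognormal(mean=1.0, sigma=0.3, size=n)   # input: labour

# Synthetic log-output with one-sided inefficiency (u) and symmetric noise (v).
u = np.abs(rng.normal(0, 0.15, size=n))
v = rng.normal(0, 0.05, size=n)
log_y = 0.3 + 0.5 * np.log(land) + 0.4 * np.log(labour) - u + v

# OLS on the log-linear Cobb-Douglas form.
X = np.column_stack([np.ones(n), np.log(land), np.log(labour)])
beta, *_ = np.linalg.lstsq(X, log_y, rcond=None)

# COLS: shift the fitted line up to the largest residual so the frontier bounds the data.
resid = log_y - X @ beta
efficiency = np.exp(resid - resid.max())              # technical efficiency in (0, 1]
print("Mean technical efficiency:", efficiency.mean().round(3))
```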
Abstract:
Solid earth simulations have recently been developed to address issues such as natural disasters, global environmental destruction, and the conservation of natural resources. The simulation of solid earth phenomena involves the analysis of complex structures including strata, faults, and heterogeneous material properties. Simulation of the generation and cycle of earthquakes is particularly important, but such simulations require the analysis of complex fault dynamics. GeoFEM is a parallel finite-element analysis system intended for problems in solid earth field phenomena. This paper describes recent developments in the GeoFEM project for the simulation of earthquake generation and cycles.
Abstract:
We analyzed the mouse Representative Transcript and Protein Set for molecules involved in brain function. We found full-length cDNAs of many known brain genes and discovered new members of known brain gene families, including Family 3 G-protein coupled receptors, voltage-gated channels, and connexins. We also identified previously unknown candidates for secreted neuroactive molecules. The existence of a large number of unique brain ESTs suggests an additional molecular complexity that remains to be explored. A list of genes containing CAG stretches in the coding region represents a first step in the potential identification of candidates for hereditary neurological disorders.
Abstract:
The volumes of the primary (PCS) and secondary (SCS) circulatory systems in the Atlantic cod Gadus morhua were determined using a modified dye dilution technique. Cod (N=10) were chronically cannulated in the second afferent branchial artery with PE-50 tubing. Evans Blue dye was bound to harvested fish plasma at a concentration of 1 mg dye ml⁻¹ plasma and injected at a dose of 1 mg kg⁻¹ body mass. Serial sampling from the cannula produced a dye dilution curve that could be described by a double exponential decay equation. Curve analysis enabled the calculation of the primary circulatory and total distribution volumes; the difference between these volumes is assumed to be the volume of the SCS. From the dilution curve it was also possible to calculate flow rates between and within the systems. The results of these experiments suggest a plasma volume in the PCS of 3.42±0.89 ml 100 g⁻¹ body mass and in the SCS of 1.68±0.35 ml 100 g⁻¹ body mass (mean ± S.D.), or approximately 50% of that of the PCS. Flow to the SCS was calculated as 2.7% of the resting cardiac output. There was an allometric relationship between body mass and blood volume. Increasing condition factor showed a tendency towards smaller PCS blood volumes, expressed as a percentage of body mass, but this was not evident for the volume of the SCS.
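A hedged sketch of the curve analysis described in this abstract: fit a double exponential decay to serial dye concentrations, estimate the primary distribution volume from the zero-time intercept of the whole curve, and approximate the total distribution volume from the slow phase's back-extrapolated intercept, taking the SCS volume as the difference. The sampling times, concentrations, dose, and the intercept-based volume formulae are illustrative assumptions, not the study's data or exact method.

```python
# Double-exponential dye dilution fit and volume estimates (illustrative only).
import numpy as np
from scipy.optimize import curve_fit

def double_exp(t, A, a, B, b):
    """C(t) = A*exp(-a*t) + B*exp(-b*t): fast (a) and slow (b) mixing phases."""
    return A * np.exp(-a * t) + B * np.exp(-b * t)

# Synthetic sampling times (min) and dye concentrations (mg per ml plasma).
t = np.array([2, 5, 10, 20, 40, 60, 90, 120], dtype=float)
dose_mg = 1.0 * 0.5          # 1 mg per kg body mass for a 0.5 kg fish (assumed)
rng = np.random.default_rng(3)
c = double_exp(t, 0.010, 0.15, 0.020, 0.005) + rng.normal(0, 0.0005, t.size)

(A, a, B, b), _ = curve_fit(double_exp, t, c, p0=(0.01, 0.1, 0.02, 0.01))

v_primary = dose_mg / (A + B)      # PCS plasma volume (ml), from C(0)
v_total = dose_mg / B              # total distribution volume (ml), slow-phase intercept
v_secondary = v_total - v_primary  # SCS volume by difference, as assumed in the abstract
print(f"PCS {v_primary:.1f} ml, SCS {v_secondary:.1f} ml")
```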