9 results for New methodology
in University of Queensland eSpace - Australia
Abstract:
Market administrators hold the vital role of maintaining sufficient generation capacity in their respective electricity markets. However, without the jurisdiction to dictate the types, locations and timing of new generation, system reliability may be compromised by the delayed entry of new generation. This paper presents a new generation investment methodology that can effectively estimate expected returns from the pool market while concurrently searching for the type and placement of a new generator to fulfil system reliability requirements.
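As a rough illustration of the search this implies, the sketch below enumerates candidate (type, placement) options, discards those that fail a reliability requirement, and ranks the rest by expected pool-market return. All names and numbers are hypothetical, not the paper's model:

    # Illustrative candidate investments: (expected pool-market return in
    # $M/yr, post-entry loss-of-load probability). Values are invented.
    CANDIDATES = {
        ("CCGT", "bus 101"): (42.0, 0.0008),
        ("OCGT", "bus 102"): (18.0, 0.0021),
        ("coal", "bus 103"): (35.0, 0.0006),
    }
    LOLP_LIMIT = 0.001  # assumed reliability threshold

    def best_investment(candidates, lolp_limit):
        # Keep (type, placement) options meeting the reliability requirement,
        # then pick the one with the highest expected pool-market return.
        feasible = {opt: ret for opt, (ret, lolp) in candidates.items()
                    if lolp <= lolp_limit}
        return max(feasible, key=feasible.get)

    print(best_investment(CANDIDATES, LOLP_LIMIT))  # ('CCGT', 'bus 101')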
Abstract:
We introduce a new second-order method of texture analysis called the Adaptive Multi-Scale Grey Level Co-occurrence Matrix (AMSGLCM), based on the well-known Grey Level Co-occurrence Matrix (GLCM) method. The method deviates significantly from GLCM in that features are extracted not via a fixed 2D weighting function of co-occurrence matrix elements, but by a variable summation of matrix elements in 3D localized neighborhoods. We subsequently present a new methodology for extracting optimized, highly discriminant features from these localized areas using adaptive Gaussian weighting functions. Genetic Algorithm (GA) optimization is used to produce a set of features whose classification value is evaluated in terms of discriminatory power and feature correlation. We critically appraised the performance of our method and GLCM in pairwise classification of images from visually similar texture classes drawn from Markov Random Field (MRF) synthesized, natural, and biological sources. In these cross-validated classification trials, our method demonstrated significant benefits over GLCM, including increased feature discriminatory power, automatic feature adaptability, and significantly improved classification performance.
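The core ideas can be sketched in a few lines of numpy: compute a co-occurrence matrix, then sum its elements under a Gaussian weighting function. The fixed centre and width below are illustrative stand-ins for the adaptive, GA-optimized Gaussians of AMSGLCM; this is not the authors' implementation:

    import numpy as np

    def glcm(image, dx=1, dy=0, levels=8):
        # Co-occurrence counts of grey-level pairs at offset (dy, dx),
        # normalised to joint probabilities.
        m = np.zeros((levels, levels))
        h, w = image.shape
        for y in range(h - dy):
            for x in range(w - dx):
                m[image[y, x], image[y + dy, x + dx]] += 1
        return m / m.sum()

    def gaussian_weighted_feature(m, centre, sigma):
        # Sum matrix elements weighted by a Gaussian centred on a chosen
        # (i, j) location -- a fixed-scale stand-in for the adaptive
        # weighting functions described in the abstract.
        idx = np.indices(m.shape)
        d2 = (idx[0] - centre[0]) ** 2 + (idx[1] - centre[1]) ** 2
        return np.sum(m * np.exp(-d2 / (2 * sigma ** 2)))

    rng = np.random.default_rng(0)
    img = rng.integers(0, 8, size=(64, 64))
    p = glcm(img)
    print(gaussian_weighted_feature(p, centre=(3, 3), sigma=1.5))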
Abstract:
A new methodology is proposed for the analysis of generation capacity investment in a deregulated market environment. The methodology performs the investment appraisal within a probabilistic framework. The probabilistic production costing (PPC) algorithm is used to compute the expected energy generated, taking into account system load variations and plant forced outage rates, while a Monte Carlo approach is applied to model the electricity price variability seen in a realistic network. The model is able to capture price, and hence profitability, uncertainties for generator companies. Seasonal variations in electricity prices and system demand are modeled independently. The method is validated on the IEEE RTS, augmented with realistic market and plant data, by using it to compare the financial viability of several generator investments employing either conventional or directly connected generator (Powerformer) technologies. The significance of the results is assessed using several financial risk measures.
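A minimal Monte Carlo sketch of this style of profitability assessment, with toy price and plant parameters standing in for the paper's PPC output and IEEE RTS data:

    import numpy as np

    rng = np.random.default_rng(1)
    N = 10_000            # Monte Carlo samples of annual average pool price
    CAPACITY_MW = 400
    HOURS = 8760
    FOR_ = 0.08           # forced outage rate -> expected energy derating
    VARIABLE_COST = 25.0  # $/MWh, hypothetical
    FIXED_COST = 40e6     # $/yr, hypothetical

    # Lognormal price model standing in for the market price variability.
    prices = rng.lognormal(mean=np.log(45.0), sigma=0.3, size=N)

    expected_energy = CAPACITY_MW * HOURS * (1 - FOR_)  # MWh/yr
    profit = (prices - VARIABLE_COST) * expected_energy - FIXED_COST

    # Simple financial risk measures over the simulated profit distribution.
    print("mean profit:", profit.mean())
    print("P(loss)    :", (profit < 0).mean())
    print("5% VaR     :", -np.percentile(profit, 5))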
Abstract:
A new device has been developed to directly measure the bubble loading of particle-bubble aggregates in industrial flotation machines, both mechanical flotation cells and flotation columns. The bubble loading of aggregates allows in-depth analysis of the operating performance of a flotation machine in terms of both pulp/collection-zone and froth-zone performance. This paper presents the methodology, along with an example demonstrating the excellent reproducibility of the device and an analysis of different operating conditions of the device itself. (C) 2004 Elsevier B.V. All rights reserved.
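For reference, bubble loading is commonly expressed as the mass of attached particles per unit bubble surface area (sometimes per unit volume of air); with symbols chosen here purely for illustration:

    B_L = m_p / A_b   [e.g. g/m^2]

where m_p is the mass of particles recovered from the sampled bubbles and A_b is their total surface area.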
Abstract:
The Integration-Responsiveness (IR) framework of Prahalad and Doz (1987) has been used extensively in the international business literature to characterize the diverse and often conflicting environmental pressures confronting firms as they expand worldwide. Although the IR framework has been applied successfully for over a decade, many theoretical and empirical studies have focused on the consequences of these pressures rather than on the pressures themselves. Prahalad and Doz identified the economic, technological, political, customer and competitive factors that create global integration and local responsiveness pressures on the diverse businesses and functions of MNEs. This article explains the methodology, including the procedures for data collection and analysis. The researchers conclude with a discussion of their findings and directions for future research, speculating on the appropriate definition of the domain of IR pressures and the criteria that might be used to validate measures of them.
Abstract:
A calibration methodology based on an efficient and stable mathematical regularization scheme is described. This scheme is a variant of so-called Tikhonov regularization in which the parameter estimation process is formulated as a constrained minimization problem. Use of the methodology eliminates the need for a modeler to formulate a parsimonious inverse problem in which a handful of parameters are designated for estimation prior to initiating the calibration process. Instead, the level of parameter parsimony required to achieve a stable solution to the inverse problem is determined by the inversion algorithm itself. Where parameters, or combinations of parameters, cannot be uniquely estimated, they are provided with values, or assigned relationships with other parameters, that are decreed to be realistic by the modeler. Conversely, where the information content of a calibration dataset is sufficient to allow estimates to be made of the values of many parameters, the making of such estimates is not precluded by preemptive parsimonizing ahead of the calibration process. While Tikhonov schemes are very attractive and hence widely used, problems with numerical stability can sometimes arise because the strength with which regularization constraints are applied throughout the regularized inversion process cannot be guaranteed to exactly complement inadequacies in the information content of a given calibration dataset. A new technique overcomes this problem by allowing relative regularization weights to be estimated as parameters through the calibration process itself. The technique is applied to the simultaneous calibration of five subwatershed models, and it is demonstrated that the new scheme results in a more efficient inversion and better enforcement of regularization constraints than traditional Tikhonov regularization methodologies. Moreover, it is argued that a joint calibration exercise of this type results in a more meaningful set of parameters than can be achieved by individual subwatershed model calibration. (C) 2005 Elsevier B.V. All rights reserved.
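For a linear toy problem, Tikhonov-regularized inversion can be sketched as a stacked least-squares solve; the fixed weight lam below is precisely the quantity the new scheme estimates as a parameter during the inversion rather than fixing in advance:

    import numpy as np

    def tikhonov_solve(X, y, L, prior, lam):
        # Minimise ||X p - y||^2 + lam * ||L (p - prior)||^2 by appending
        # the regularization constraints as extra "observations". lam sets
        # how strongly the constraints are enforced.
        A = np.vstack([X, np.sqrt(lam) * L])
        b = np.concatenate([y, np.sqrt(lam) * L @ prior])
        p, *_ = np.linalg.lstsq(A, b, rcond=None)
        return p

    rng = np.random.default_rng(2)
    X = rng.normal(size=(30, 10))           # toy sensitivity (Jacobian) matrix
    true_p = np.linspace(1.0, 2.0, 10)      # smoothly varying "true" parameters
    y = X @ true_p + rng.normal(scale=0.1, size=30)
    L = (np.eye(10) - np.eye(10, k=1))[:-1]  # difference operator: favor smoothness
    print(tikhonov_solve(X, y, L, np.ones(10), lam=10.0))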
Abstract:
Primary objective: To trial the method of email-facilitated qualitative interviewing with people with traumatic brain injury (TBI). Research design: Qualitative semi-structured email-facilitated interviews. Procedures: Nineteen people with a TBI (17 with a severe diagnosis) participated in email interviews. Main outcomes and results: Findings indicate that this method facilitates the participation of people with TBI in qualitative interviews. Advantages include increased time for reflection and composing answers, and greater control over the interview setting. In addition, the data indicate that people with a TBI are capable of greater insight, reflection and humour than previous research has suggested. Conclusion: Findings indicate that new technologies may advance data collection methods for people with cognitive-linguistic impairments who face participation barriers in face-to-face interviews.