996 results for Decomposition framework


Relevance: 70.00%

Abstract:

In this paper, a framework based on the decomposition of the first-order optimality conditions is described and applied to solve the Probabilistic Power Flow (PPF) problem in a coordinated but decentralized way in the context of multi-area power systems. The purpose of the decomposition framework is to solve the problem by iteratively solving smaller subproblems, each associated with one area of the power system. This strategy allows probabilistic analysis of the variables of interest in a particular area without explicit knowledge of the network data of the other interconnected areas; only border information related to the tie-lines between areas needs to be exchanged. An efficient method for probabilistic analysis, considering uncertainty in n system loads, is applied: a particular case of the point estimate method, known as the Two-Point Estimate Method (TPM), is used rather than the traditional approach based on Monte Carlo simulation. The main feature of the TPM is that it requires solving only 2n power flows to obtain the behavior of any random variable. An iterative coordination algorithm between areas is also presented. This algorithm solves the Multi-Area PPF problem in a decentralized way, ensures the independent operation of each area, and integrates the decomposition framework and the TPM appropriately. The IEEE RTS-96 system is used to show the operation and effectiveness of the proposed approach, and Monte Carlo simulations are used to validate the results. © 2011 IEEE.
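
The 2n-evaluation scheme at the heart of the TPM can be sketched in a few lines. The following is a minimal illustration for symmetric (zero-skewness) input distributions; the quadratic toy model, its coefficients and the load figures are hypothetical stand-ins for a full AC power flow of one area.

import numpy as np

def two_point_estimate(model, mu, sigma):
    """Two-point estimate method (2n scheme, zero skewness assumed).

    model : callable mapping an input vector to a scalar output
    mu, sigma : means and standard deviations of the n uncertain inputs
    Returns the estimated mean and standard deviation of the output.
    """
    n = len(mu)
    weight = 1.0 / (2 * n)              # equal weights for symmetric inputs
    m1 = m2 = 0.0                       # running first and second raw moments
    for i in range(n):
        for sign in (+1.0, -1.0):
            point = mu.copy()
            point[i] = mu[i] + sign * np.sqrt(n) * sigma[i]   # shift one input only
            z = model(point)            # one deterministic "power flow" run
            m1 += weight * z
            m2 += weight * z ** 2
    return m1, np.sqrt(max(m2 - m1 ** 2, 0.0))

# Toy stand-in for a power-flow output (e.g., a tie-line flow) driven by
# n uncertain loads; all coefficients are hypothetical.
rng = np.random.default_rng(0)
n = 5
mu = rng.uniform(50.0, 100.0, n)        # mean loads (MW)
sigma = 0.1 * mu                        # 10% standard deviation
coeff = rng.uniform(0.1, 0.5, n)
model = lambda loads: coeff @ loads + 0.001 * (loads @ loads)

mean, std = two_point_estimate(model, mu, sigma)
print(f"TPM estimate: mean = {mean:.2f}, std = {std:.2f} ({2 * n} model runs)")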

Relevance: 70.00%

Abstract:

Process systems design, operation and synthesis problems under uncertainty can readily be formulated as two-stage stochastic mixed-integer linear and nonlinear (nonconvex) programming (MILP and MINLP) problems. These problems, with a scenario-based formulation, lead to large-scale MILPs/MINLPs that are well structured. The first part of the thesis proposes a new finitely convergent cross decomposition method (CD), in which Benders decomposition (BD) and Dantzig-Wolfe decomposition (DWD) are combined in a unified framework to improve the solution of scenario-based two-stage stochastic MILPs. This method alternates between DWD iterations and BD iterations, where DWD restricted master problems and BD primal problems yield a sequence of upper bounds, and BD relaxed master problems yield a sequence of lower bounds. A variant of CD, called multicolumn-multicut CD, which adds multiple columns per iteration of the DWD restricted master problem and multiple cuts per iteration of the BD relaxed master problem, is then developed to improve solution time. Finally, an extended cross decomposition method (ECD) for solving two-stage stochastic programs with risk constraints is proposed. In this approach, CD at the first level and DWD at the second level are used to solve the original problem to optimality. ECD has a computational advantage over a bilevel decomposition strategy or solving the monolithic problem with an MILP solver. The second part of the thesis develops a joint decomposition approach combining Lagrangian decomposition (LD) and generalized Benders decomposition (GBD) to efficiently solve stochastic mixed-integer nonlinear nonconvex programming problems to global optimality, without the need for explicit branch-and-bound search. In this approach, LD subproblems and GBD subproblems are systematically solved in a single framework. The relaxed master problem, obtained from a reformulation of the original problem, is solved only when necessary. A convexification of the relaxed master problem and a domain reduction procedure are integrated into the decomposition framework to improve solution efficiency. Case studies taken from renewable-resource and fossil-fuel based applications in process systems engineering show that these novel decomposition approaches have significant benefits over classical decomposition methods and state-of-the-art MILP/MINLP global optimization solvers.
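
The upper/lower bounding interplay that the BD side contributes can be illustrated on a tiny two-stage stochastic LP. Below is a minimal single-cut Benders (L-shaped) sketch using SciPy; the scalar capacity/recourse problem and every number in it are hypothetical, and the DWD half of cross decomposition is not shown.

import numpy as np
from scipy.optimize import linprog

# Toy two-stage problem: choose a capacity x at unit cost 1; in scenario s
# (probability p[s]) the unmet demand max(d[s] - x, 0) is bought at unit cost 3.
p = np.array([0.3, 0.4, 0.3])
d = np.array([2.0, 5.0, 8.0])
recourse_price = 3.0

cut_rows, cut_rhs = [], []              # accumulated optimality cuts on (x, theta)
best_ub = np.inf

for it in range(20):
    # Relaxed master problem: min x + theta subject to the cuts collected so far.
    res = linprog(c=[1.0, 1.0],
                  A_ub=np.array(cut_rows) if cut_rows else None,
                  b_ub=np.array(cut_rhs) if cut_rows else None,
                  bounds=[(0, 10), (0, None)], method="highs")
    x, theta = res.x
    lb = res.fun                                        # lower bound

    # Scenario subproblems in dual form: Q_s(x) = max {(d_s - x) pi : 0 <= pi <= 3}.
    pi = np.array([linprog(c=[-(ds - x)], bounds=[(0, recourse_price)],
                           method="highs").x[0] for ds in d])
    q = pi * (d - x)                                    # scenario recourse costs
    best_ub = min(best_ub, x + p @ q)                   # upper bound
    print(f"iteration {it}: LB = {lb:.3f}, UB = {best_ub:.3f}")
    if best_ub - lb <= 1e-6:
        break

    # Single aggregated optimality cut: theta >= sum_s p_s * pi_s * (d_s - x).
    cut_rows.append([-(p @ pi), -1.0])
    cut_rhs.append(-(p @ (pi * d)))

print(f"capacity x = {x:.3f}, expected total cost = {best_ub:.3f}")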

Relevance: 60.00%

Abstract:

Entity-oriented search has become an essential component of modern search engines. It focuses on retrieving a list of entities, or information about specific entities, instead of documents. In this paper, we study the problem of finding entity-related information, referred to as attribute-value pairs, which plays a significant role in searching for target entities. We propose a novel decomposition framework combining reduced relations and a discriminative model, the Conditional Random Field (CRF), for automatically finding entity-related attribute-value pairs in free-text documents. This decomposition framework allows us to locate potential text fragments and identify their hidden semantics, in the form of attribute-value pairs, for user queries. Empirical analysis shows that the decomposition framework outperforms pattern-based approaches due to its capability of effectively integrating syntactic and semantic features.
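
The CRF labelling step can be pictured as BIO sequence tagging over candidate text fragments. The sketch below uses the sklearn-crfsuite package, which is an assumption (the paper does not name a toolkit), together with two invented training sentences and hypothetical ATTR/VAL tags.

import sklearn_crfsuite   # assumed toolkit, not named in the paper

# Toy training data: tokens labelled with BIO tags for attribute (ATTR)
# and value (VAL) spans; sentences and tags are invented.
train_sents = [
    [("battery", "B-ATTR"), ("life", "I-ATTR"), ("is", "O"),
     ("10", "B-VAL"), ("hours", "I-VAL")],
    [("screen", "B-ATTR"), ("size", "I-ATTR"), ("of", "O"),
     ("13", "B-VAL"), ("inches", "I-VAL")],
]

def token_features(sent, i):
    """Simple lexical features for token i of a tokenised fragment."""
    word = sent[i][0]
    feats = {"lower": word.lower(), "is_digit": word.isdigit(),
             "suffix3": word[-3:]}
    feats["prev"] = sent[i - 1][0].lower() if i > 0 else "<BOS>"
    feats["next"] = sent[i + 1][0].lower() if i < len(sent) - 1 else "<EOS>"
    return feats

X = [[token_features(s, i) for i in range(len(s))] for s in train_sents]
y = [[tag for _, tag in s] for s in train_sents]

crf = sklearn_crfsuite.CRF(algorithm="lbfgs", c1=0.1, c2=0.1, max_iterations=50)
crf.fit(X, y)

test = [("battery", ""), ("life", ""), ("is", ""), ("12", ""), ("hours", "")]
print(crf.predict([[token_features(test, i) for i in range(len(test))]]))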

Relevance: 60.00%

Abstract:

House prices in the Australian capital cities have been increasing over the last two decades, with average annual increases of over 10%. In Melbourne, Brisbane and Perth, house prices increased by more than 15% annually, while house prices in Darwin increased even faster, by about 21%. It is surprising that, after a decrease in 2008, house prices in the Australian capital cities showed a strong recovery in the last financial year. How to read house prices in cities across a country has been an issue of public interest since the late 1980s. Various models have been developed to investigate the behaviour of house prices over time or space. A spatio-temporal model, introduced in the recent literature, appears advantageous in accounting for spatial effects on house prices. However, the decay of temporal effects and the temporal dynamics of the spatial effects cannot be addressed by the spatio-temporal model. This research suggests a three-part decomposition framework for reading urban house price behaviour. Based on the spatio-temporal model, a time-weighted spatio-temporal model is developed. This new model assumes that urban house price movements should be decomposed into urban characteristic factors, time-correlated factors and space-correlated factors. A time weight is constructed to capture the temporal decay of the time-correlated effects, while a spatio-temporal weight is constructed to account for the time-varying space-correlated effects. The house prices of the Australian capital cities are investigated using the time-weighted spatio-temporal model. The empirical findings suggest that the housing markets should be clustered by their geographic locations. The rest of this paper is organised as follows. The following section presents a principle for reading urban house prices. The next section outlines the methodology of the time-weighted spatio-temporal model. The subsequent section reports the data and empirical results, while the final section draws the conclusions.
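
One way to realise the two weights described above is sketched below: a time weight with exponential decay over the age of each earlier comparable sale, and a spatio-temporal weight that modulates spatial distance decay by the same temporal decay. The exponential decay forms, the decay parameters and the toy sales data are assumptions for illustration only.

import numpy as np

rng = np.random.default_rng(1)
n = 6
coords = rng.uniform(0, 10, size=(n, 2))       # sale locations (km)
sale_time = np.sort(rng.uniform(0, 24, n))     # months since start, ordered

dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=2)
time_gap = sale_time[:, None] - sale_time[None, :]   # > 0 means sale j is earlier

tau, phi = 6.0, 2.0                 # temporal and spatial decay scales (assumed)
earlier = time_gap > 0              # only earlier sales may influence a price

# Time weight: exponential decay with the age of the comparable sale.
W_T = np.where(earlier, np.exp(-time_gap / tau), 0.0)

# Spatio-temporal weight: spatial decay damped by the same temporal decay,
# so the influence of nearby sales fades as they become older.
W_ST = np.where(earlier, np.exp(-dist / phi) * np.exp(-time_gap / tau), 0.0)

# Row-standardise so each price is compared with a weighted average of earlier
# sales (rows with no earlier sale are left as zeros).
for W in (W_T, W_ST):
    row_sum = W.sum(axis=1, keepdims=True)
    np.divide(W, row_sum, out=W, where=row_sum > 0)

print(W_ST.round(3))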

Relevance: 60.00%

Abstract:

Non-linear behavior of soils during a seismic event plays a predominant role in current site response analysis. Soil response analysis consistently indicates that the stress-strain relationship of soils is nonlinear and shows hysteresis. When focusing on forced response simulations, time integrations based on modal analysis are widely used; however, parametric analyses, non-linear behavior and complex damping functions make the online use of standard discretization strategies, e.g. those based on finite elements, difficult. In this paper, we propose a new harmonic analysis formulation able to address forced response simulation of soils exhibiting their characteristic nonlinear behavior. The solution can be evaluated in real time from the offline construction of a parametric solution of the associated linearized problem within the Proper Generalized Decomposition framework.

Relevance: 40.00%

Abstract:

Glaucoma is the second leading cause of blindness worldwide. Often, glaucomatous damage to the optic nerve head (ONH) and ONH changes occur prior to visual field loss and are observable in vivo. Thus, digital image analysis is a promising choice for detecting the onset and/or progression of glaucoma. In this paper, we present a new framework for detecting glaucomatous changes in the ONH of an eye using the method of proper orthogonal decomposition (POD). A baseline topographic subspace was constructed for each eye to describe the structure of its ONH at a reference/baseline condition using POD. Any glaucomatous changes in the ONH present during a follow-up exam were estimated by comparing the follow-up ONH topography with its baseline subspace representation. Image correspondence measures, namely the L1-norm, the L2-norm, correlation, and the image Euclidean distance (IMED), were used to quantify the ONH changes. An ONH topographic library built from the Louisiana State University Experimental Glaucoma study was used to evaluate the performance of the proposed method. The area under the receiver operating characteristic curve (AUC) was used to compare the diagnostic performance of the POD-induced parameters with the parameters of the topographic change analysis (TCA) method. The IMED and L2-norm parameters in the POD framework provided the highest AUCs of 0.94 at a 10-degree field of imaging and 0.91 at a 15-degree field of imaging, compared to AUCs of 0.86 and 0.88, respectively, for the TCA parameters. The proposed POD framework captures the instrument measurement variability and inherent structure variability and shows promise for improving our ability to detect glaucomatous change over time in glaucoma management.
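
The baseline-subspace construction and follow-up comparison can be sketched with an SVD-based POD. The image size, the number of baseline exams, the noise levels and the 99% energy threshold below are hypothetical, and the IMED measure is omitted for brevity.

import numpy as np

rng = np.random.default_rng(0)
pixels, n_baseline = 64 * 64, 6

truth = rng.normal(size=pixels)                       # "true" baseline ONH shape
baseline = np.column_stack(
    [truth + 0.05 * rng.normal(size=pixels) for _ in range(n_baseline)])

# Proper orthogonal decomposition of the baseline exams via SVD.
U, s, _ = np.linalg.svd(baseline, full_matrices=False)
rank = np.searchsorted(np.cumsum(s**2) / np.sum(s**2), 0.99) + 1
basis = U[:, :rank]                                   # baseline topographic subspace

def change_scores(followup):
    """L2-norm and correlation between a follow-up exam and its projection."""
    recon = basis @ (basis.T @ followup)
    return np.linalg.norm(followup - recon), np.corrcoef(followup, recon)[0, 1]

stable = truth + 0.05 * rng.normal(size=pixels)       # no structural change
progressed = truth + 0.05 * rng.normal(size=pixels)
progressed[:500] += 0.8                               # localised topographic change

print("stable    :", change_scores(stable))
print("progressed:", change_scores(progressed))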

Relevance: 40.00%

Abstract:

A novel route to prepare highly active and stable N2O decomposition catalysts, based on Fe-exchanged beta zeolite, is presented. The procedure consists of liquid-phase Fe(III) exchange at low pH. By varying the pH systematically from 3.5 to 0, using nitric acid during each Fe(III)-exchange procedure, the degree of dealumination was controlled, as verified by ICP and NMR. Dealumination changes the presence of octahedral Al sites neighbouring the Fe sites, improving the performance for this reaction. The catalysts so obtained exhibit a remarkable enhancement in activity, with an optimal pH of 1. Further optimization by increasing the Fe content is possible. The optimal formulation showed good conversion levels, comparable to a benchmark Fe-ferrierite catalyst. The catalyst stability under tail-gas conditions containing NO, O2 and H2O was excellent, without any appreciable activity decay during 70 h on stream. Based on characterisation and data analysis from ICP, single pulse excitation NMR, MQ MAS NMR, N2 physisorption, TPR(H2) analysis and apparent activation energies, the improved catalytic performance is attributed to an increased concentration of active sites. Temperature-programmed reduction experiments reveal significant changes in the Fe(III) reducibility pattern, with the presence of two reduction peaks, one tentatively attributed to the interaction of the Fe-oxo species with electron-withdrawing extraframework AlO6 species, causing a delayed reduction. A low-temperature peak is attributed to Fe species exchanged on zeolitic AlO4 sites, which are partially charged by the presence of the neighbouring extraframework AlO6 sites. Improved mass transport due to acid leaching is ruled out. The increased activity is rationalized by an active-site model in which the concentration of active sites increases by selectively washing out the distorted extraframework AlO6 species under (optimal) acidic conditions, liberating active Fe species.

Relevance: 30.00%

Abstract:

As process management projects have increased in size due to globalised and company-wide initiatives, a corresponding growth in the size of process modeling projects can be observed. Despite advances in languages, tools and methodologies, several aspects of these projects have been largely ignored by the academic community. This paper makes a first contribution to a potential research agenda in this field by defining the characteristics of large-scale process modeling projects and proposing a framework of related issues. These issues are derived from a semi-structured interview and six focus groups conducted in Australia, Germany and the USA with enterprise and modeling software vendors and customers. The focus groups confirm the existence of unresolved problems in business process modeling projects. The outcomes provide a research agenda which directs researchers towards further studies in global process management, process model decomposition and the overall governance of process modeling projects. It is expected that this research agenda will provide guidance to researchers and practitioners by focusing on areas of high theoretical and practical relevance.

Relevance: 30.00%

Abstract:

In information retrieval (IR) research, more and more focus has been placed on optimizing a query language model by detecting and estimating the dependencies between the query and the observed terms occurring in the selected relevance feedback documents. In this paper, we propose a novel Aspect Language Modeling framework featuring term association acquisition, document segmentation, query decomposition, and an Aspect Model (AM) for parameter optimization. Through the proposed framework, we advance the theory and practice of applying high-order and context-sensitive term relationships to IR. We first decompose a query into subsets of query terms. Then we segment the relevance feedback documents into chunks using multiple sliding windows. Finally, we discover the higher-order term associations, that is, the terms in these chunks with a high degree of association to the subsets of the query. In this process, we adopt an approach combining the AM with Association Rule (AR) mining. In our approach, the AM not only considers the subsets of a query as “hidden” states and estimates their prior distributions, but also evaluates the dependencies between the subsets of a query and the observed terms extracted from the chunks of feedback documents. The AR mining provides a reasonable initial estimation of the high-order term associations by discovering association rules from the document chunks. Experimental results on various TREC collections verify the effectiveness of our approach, which significantly outperforms a baseline language model and two state-of-the-art query language models, namely the Relevance Model and the Information Flow model.
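
The query decomposition, sliding-window segmentation and association counting described above can be sketched as follows. The documents, the query, the window sizes and the confidence-style association measure are toy assumptions; the Aspect Model estimation itself is not shown.

from itertools import combinations
from collections import Counter

feedback_docs = [
    "solar panel efficiency depends on cell temperature and irradiance".split(),
    "panel temperature reduces output efficiency of the solar cell".split(),
]
query = ["solar", "panel", "efficiency"]
window_sizes = (4, 6)

# 1. Segment feedback documents into overlapping chunks with sliding windows.
chunks = [doc[i:i + w]
          for doc in feedback_docs
          for w in window_sizes
          for i in range(max(len(doc) - w + 1, 1))]

# 2. Decompose the query into subsets of query terms (the hidden "aspects").
subsets = [frozenset(c) for r in (1, 2, 3) for c in combinations(query, r)]

# 3. Count subset occurrences and subset-to-term co-occurrences per chunk.
subset_count, pair_count = Counter(), Counter()
for chunk in chunks:
    terms = set(chunk)
    for sub in subsets:
        if sub <= terms:
            subset_count[sub] += 1
            for term in terms - set(query):
                pair_count[(sub, term)] += 1

# 4. Confidence of the association rule  subset -> term.
confidence = {k: v / subset_count[k[0]] for k, v in pair_count.items()}
for (sub, term), conf in sorted(confidence.items(), key=lambda x: -x[1])[:5]:
    print(f"{set(sub)} -> {term}: {conf:.2f}")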

Relevance: 30.00%

Abstract:

This paper presents an input-orientated data envelopment analysis (DEA) framework which allows the measurement and decomposition of economic, environmental and ecological efficiency levels in agricultural production across different countries. Economic, environmental and ecological optimisations search for optimal input combinations that minimise, respectively, total costs, the total amount of nutrients, and the total amount of cumulative exergy contained in inputs. The application of the framework to an agricultural dataset of 30 OECD countries revealed that (i) there was significant scope to make their agricultural production systems more environmentally and ecologically sustainable; (ii) the improvement in environmental and ecological sustainability could be achieved by being more technically efficient and, even more significantly, by changing the input combinations; and (iii) the rankings of sustainability varied significantly across OECD countries within frontier-based environmental and ecological efficiency measures and between frontier-based measures and indicators.
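
The building block of such a framework is an input-orientated DEA envelopment problem solved once per country (decision-making unit). A minimal constant-returns-to-scale sketch using SciPy is given below; the two inputs, one output and all figures are invented stand-ins for cost, nutrient or exergy data.

import numpy as np
from scipy.optimize import linprog

X = np.array([[100, 80, 120, 90],       # input 1 (e.g., fertiliser) per DMU
              [ 50, 45,  70, 40]])      # input 2 (e.g., energy) per DMU
Y = np.array([[200, 190, 210, 230]])    # single output (e.g., crop value)

m, n = X.shape            # m inputs, n DMUs
s = Y.shape[0]            # s outputs

def input_efficiency(j):
    """min theta  s.t.  X @ lam <= theta * X[:, j],  Y @ lam >= Y[:, j],  lam >= 0."""
    c = np.r_[1.0, np.zeros(n)]                 # decision variables: theta, lambda
    A_in = np.c_[-X[:, [j]], X]                 # X lam - theta * x_j <= 0
    A_out = np.c_[np.zeros((s, 1)), -Y]         # -Y lam <= -y_j
    res = linprog(c, A_ub=np.vstack([A_in, A_out]),
                  b_ub=np.r_[np.zeros(m), -Y[:, j]],
                  bounds=[(None, None)] + [(0, None)] * n, method="highs")
    return res.x[0]

for j in range(n):
    print(f"DMU {j}: input-orientated efficiency = {input_efficiency(j):.3f}")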

Relevance: 30.00%

Abstract:

Diagnostics is based on the characterization of mechanical system condition and allows early detection of a possible fault. Signal processing is an approach widely used in diagnostics, since it allows directly characterizing the state of the system. Several types of advanced signal processing techniques have been proposed in recent decades and added to more conventional ones. Seldom are these techniques able to handle non-stationary operations. Diagnostics of roller bearings is no exception. In this paper, a new vibration signal processing tool, able to perform roller bearing diagnostics under any working condition and noise level, is developed on the basis of two data-adaptive techniques, Empirical Mode Decomposition (EMD) and Minimum Entropy Deconvolution (MED), coupled by means of the Hilbert transform. The effectiveness of the new signal processing tool is proven by means of experimental data measured on a test rig that employs high-power, industrial-size components.
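
A minimal sketch of the EMD plus Hilbert-envelope part of such a pipeline is shown below. It assumes the PyEMD package for the decomposition (the paper does not name an implementation), the MED pre-whitening step is omitted, and the simulated fault signal and all its parameters are purely illustrative.

import numpy as np
from scipy.signal import hilbert
from PyEMD import EMD          # assumed implementation of Empirical Mode Decomposition

# Simulated bearing-like signal: periodic fault impulses exciting a
# high-frequency resonance, buried in noise (all values hypothetical).
fs, fault_freq, resonance = 20_000, 97.0, 3_000.0
t = np.arange(0, 1.0, 1 / fs)
impulses = (np.sin(2 * np.pi * fault_freq * t) > 0.999).astype(float)
kernel = np.exp(-2_000 * t[:200]) * np.sin(2 * np.pi * resonance * t[:200])
signal = np.convolve(impulses, kernel, mode="same") \
         + 0.3 * np.random.default_rng(0).normal(size=t.size)

# 1. Empirical Mode Decomposition into intrinsic mode functions (IMFs).
imfs = EMD().emd(signal)

# 2. Select the most impulsive IMF (highest kurtosis) as the diagnostic band.
kurtosis = [np.mean((imf - imf.mean())**4) / np.var(imf)**2 for imf in imfs]
best = imfs[int(np.argmax(kurtosis))]

# 3. Hilbert envelope spectrum: the fault frequency should show up as a peak.
envelope = np.abs(hilbert(best))
spectrum = np.abs(np.fft.rfft(envelope - envelope.mean()))
freqs = np.fft.rfftfreq(envelope.size, 1 / fs)
print(f"dominant envelope frequency: {freqs[np.argmax(spectrum)]:.1f} Hz "
      f"(simulated fault frequency: {fault_freq} Hz)")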

Relevance: 30.00%

Abstract:

This paper presents a novel framework to further advance the recent trend of using query decomposition and high-order term relationships in query language modeling, taking into account terms implicitly associated with different subsets of query terms. Existing approaches, most notably the language model based on the Information Flow method, are however unable to capture multiple levels of associations and also suffer from a high computational overhead. In this paper, we propose to compute association rules from pseudo-feedback documents that are segmented into variable-length chunks via multiple sliding windows of different sizes. Extensive experiments have been conducted on various TREC collections, and our approach significantly outperforms a baseline Query Likelihood language model, the Relevance Model and the Information Flow model.

Relevance: 30.00%

Abstract:

In the context of the removal of organic pollutants from wastewater, the sonolysis of CCl4 dissolved in water has been widely investigated. These investigations are either completely experimental or correlate data empirically. In this work, a quantitative model is developed to predict the rate of sonolysis of aqueous CCl4. The model considers the isothermal growth and partially adiabatic collapse of cavitation bubbles containing gas and vapor, leading to conditions of high temperature and pressure within them; the attainment of thermodynamic equilibrium at the end of collapse; the release of bubble contents into the liquid pool; and reactions in the well-mixed pool. The model successfully predicts the extent of degradation of dissolved CCl4 and the influence of various parameters such as the initial concentration of CCl4, the temperature, and the nature of the gas atmosphere above the liquid. In particular, it predicts the results of Hua and Hoffmann (Environ. Sci. Technol., 1996, 30, 864-871), who found that degradation is first order in CCl4 and that argon and Ar-O3 atmospheres give the same results. The framework of the model is capable of quantitatively describing the degradation of many dissolved organics by considering all the involved species.
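
The first-order behaviour reported above corresponds to a simple rate law. The short sketch below, with a hypothetical rate constant and initial concentration, only illustrates that rate law; it is not the bubble-dynamics model itself.

import numpy as np

# First-order degradation: dC/dt = -k C  =>  C(t) = C0 * exp(-k t).
k = 3.0e-4                    # apparent rate constant, 1/s (hypothetical)
C0 = 1.0e-4                   # initial CCl4 concentration, mol/L (hypothetical)
t = np.linspace(0, 3600, 7)   # one hour of sonication

C = C0 * np.exp(-k * t)
for ti, ci in zip(t, C):
    print(f"t = {ti:5.0f} s   C/C0 = {ci / C0:.3f}")

# A first-order process gives a straight line of slope -k on a ln(C) vs t plot.
slope = np.polyfit(t, np.log(C), 1)[0]
print(f"recovered rate constant: {-slope:.2e} 1/s")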

Relevance: 30.00%

Abstract:

The Australian government has recently pledged a reduction in GHG emissions of 26–28% below the 2005 level by 2030. How big is the challenge for the country to achieve this target in terms of its present emissions profile, recent historical trends, and the contributions to those trends from key proximate factors driving emissions? In this paper, we attempt a quantitative judgement of the challenge by using decomposition analysis. Based on the analysis, it appears the announced target will be quite challenging to achieve if the average annual mitigating effects from economic restructuring, energy efficiency improvements and movement towards less emissions-intensive energy sources in evidence over 2002–2013 continue through to 2030; however, if the contribution from these mitigating sources in evidence over 2006–2013 can be sustained, achievement of the target will be much less challenging. The challenge for government, then, will be to provide a policy framework ensuring that the more pronounced beneficial impacts of the mitigating factors evidenced during 2006–2013 can be maintained over the years to 2030.
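
A common way to attribute an emissions change to such proximate factors is a Kaya-identity decomposition with the logarithmic mean Divisia index (LMDI). The sketch below uses invented start-year and end-year figures purely to show the arithmetic; it does not reproduce the paper's data or factor set.

import math

# Kaya identity: C = P * (G/P) * (E/G) * (C/E), i.e. population, per-capita GDP,
# energy intensity of GDP and carbon intensity of energy.  LMDI attributes the
# change C_T - C_0 additively to the four factors.
def logmean(a, b):
    return a if a == b else (a - b) / (math.log(a) - math.log(b))

year0 = {"P": 20.0, "G/P": 55.0, "E/G": 4.0, "C/E": 0.09}    # hypothetical
yearT = {"P": 24.0, "G/P": 65.0, "E/G": 3.4, "C/E": 0.08}    # hypothetical

C0 = math.prod(year0.values())
CT = math.prod(yearT.values())
L = logmean(CT, C0)

effects = {k: L * math.log(yearT[k] / year0[k]) for k in year0}
print(f"total change in emissions: {CT - C0:+.2f}")
for k, v in effects.items():
    print(f"  {k:4s} effect: {v:+.2f}")
print(f"  sum of effects:  {sum(effects.values()):+.2f}")   # equals the total change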

Relevance: 30.00%

Abstract:

In this paper, the classical problem of homogenization of elliptic operators with periodically oscillating coefficients in arbitrary domains is considered. Using Bloch wave decomposition, a new proof of convergence is furnished. It sheds new light on, and offers an alternative way to view, the classical results. In a natural way, this method leads us to work in Fourier space and thus in a framework dual to the one used by L. Tartar [Problèmes d'Homogénéisation dans les Équations aux Dérivées Partielles, Cours Peccot au Collège de France, 1977] in his method of homogenization. Further, this technique offers a nontraditional way of calculating the homogenized coefficients which is easy to implement on a computer.
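
For context, the link between the first Bloch eigenvalue and the homogenized coefficients that this calculation exploits can be stated as follows; this is the standard relation from the Bloch-wave homogenization literature, added here for orientation rather than quoted from the abstract.

% Let lambda_1(eta) be the first eigenvalue of the shifted cell operator
%   A(eta)u = -(\nabla + i\eta)\cdot\bigl(a(y)\,(\nabla + i\eta)u\bigr)
% with periodic boundary conditions on the unit cell. Near eta = 0 it satisfies
% lambda_1(0) = 0 and \nabla_\eta \lambda_1(0) = 0, and its Hessian gives the
% homogenized coefficients:
\[
  q_{k\ell} \;=\; \frac{1}{2}\,
  \frac{\partial^{2}\lambda_{1}}{\partial \eta_{k}\,\partial \eta_{\ell}}(0),
  \qquad k,\ell = 1,\dots,N,
\]
% so that the homogenized operator is  -\,\operatorname{div}\bigl(q\,\nabla u\bigr).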