955 results for Mixed complementarity problem


Relevance: 100.00%

Abstract:

A variational inequality problem (VIP) satisfying a constraint qualification can be reduced to a mixed complementarity problem (MCP). Monotonicity of the VIP implies that the MCP is also monotone. By introducing regularizing perturbations, a sequence of strictly monotone mixed complementarity problems is generated. It is shown that, if the original problem is solvable, the sequence of computable inexact solutions of the strictly monotone MCPs is bounded and every accumulation point is a solution. Under an additional condition on the precision used for solving each subproblem, the sequence converges to the minimum norm solution of the MCP. Copyright © 2000 by Marcel Dekker, Inc.
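As a sketch of the regularization scheme described above (the perturbation and tolerance rule below are generic Tikhonov-type choices, assumed for illustration rather than taken from the paper):

```latex
% Box-constrained MCP: find x in [l,u] with x_i = l_i => F_i(x) >= 0,
% l_i < x_i < u_i => F_i(x) = 0, and x_i = u_i => F_i(x) <= 0.
% Regularized subproblems (assumed Tikhonov-type form):
\[
  F_k(x) \;=\; F(x) + \varepsilon_k\, x, \qquad \varepsilon_k \downarrow 0 .
\]
% Each MCP(F_k, [l,u]) is strictly monotone, hence uniquely solvable. An inexact solution
% x^k is accepted once its projection (natural) residual is small enough,
\[
  \bigl\| x^k - P_{[l,u]}\!\bigl(x^k - F_k(x^k)\bigr) \bigr\| \;\le\; \delta_k,
  \qquad \delta_k \downarrow 0,
\]
% and, under a suitable link between \delta_k and \varepsilon_k, the sequence {x^k}
% converges to the minimum norm solution of the original MCP.
```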

Relevance: 100.00%

Abstract:

A reformulation of the bounded mixed complementarity problem is introduced. It is proved that the level sets of the objective function are bounded and, under reasonable assumptions, stationary points coincide with solutions of the original variational inequality problem. Therefore, standard minimization algorithms applied to the new reformulation must succeed. This result is applied to the compactification of unbounded mixed complementarity problems. © 2001 OPA (Overseas Publishers Association) N.V. Published by license under the Gordon and Breach Science Publishers imprint, a member of the Taylor & Francis Group.
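The following is a minimal, generic illustration of attacking a bounded MCP by box-constrained minimization of a merit function; the merit used here is the squared projection residual, not necessarily the reformulation proposed in the paper, and the data are made up.

```python
# Bounded MCP on the box [l, u]: x_i = l_i -> F_i(x) >= 0,
# l_i < x_i < u_i -> F_i(x) = 0, x_i = u_i -> F_i(x) <= 0.
import numpy as np
from scipy.optimize import minimize

l = np.zeros(2)
u = np.full(2, 2.0)

def F(x):
    # Small monotone affine operator, chosen only for illustration.
    M = np.array([[2.0, 1.0], [1.0, 3.0]])
    q = np.array([-1.0, -2.0])
    return M @ x + q

def merit(x):
    # Squared norm of the natural residual r(x) = x - P_[l,u](x - F(x));
    # r(x) = 0 exactly at solutions of the MCP.  (Only piecewise smooth,
    # so L-BFGS-B with numerical gradients is used pragmatically here.)
    r = x - np.clip(x - F(x), l, u)
    return float(r @ r)

res = minimize(merit, x0=np.array([1.0, 1.0]), method="L-BFGS-B",
               bounds=list(zip(l, u)))
print(res.x, merit(res.x))   # a merit value near zero indicates an approximate solution
```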

Relevance: 100.00%

Abstract:

The extended linear complementarity problem (XLCP) has been introduced in a recent paper by Mangasarian and Pang. In the present research, minimization problems with simple bounds associated with this problem are defined. When the XLCP is solvable, its solutions are global minimizers of the associated problems. Sufficient conditions are proved which guarantee that stationary points of the associated problems are solutions of the XLCP. These theoretical results support the conjecture that local methods for box constrained optimization applied to the associated problems could be efficient tools for solving the XLCP. (C) 1998 Elsevier B.V. All rights reserved.
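For reference, one commonly cited statement of the XLCP is sketched below; sign and set conventions vary across the literature, so this is an assumed form rather than necessarily the exact one used by Mangasarian and Pang.

```latex
% Given matrices M, N of size m x n and a polyhedron P in R^m (assumed form of the XLCP):
\[
  \text{find } x, y \in \mathbb{R}^{n}, \quad x \ge 0, \; y \ge 0, \quad
  Mx - Ny \in P, \quad x^{\mathsf T} y = 0 .
\]
% A minimization problem with simple bounds, of the kind described above, can be obtained by
% keeping only the bounds x >= 0, y >= 0 and penalizing the remaining conditions in the
% objective; the XLCP is solvable exactly when the optimal value of such a problem is zero.
```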

Relevance: 100.00%

Abstract:

Minimization of a differentiable function subject to box constraints is proposed as a strategy for solving the generalized nonlinear complementarity problem (GNCP) defined on a polyhedral cone. The approach does not require calculating projections, which complicate and sometimes even prevent the implementation of algorithms for solving these kinds of problems. Theoretical results that relate stationary points of the minimized function to solutions of the GNCP are presented. Perturbations of the GNCP are also considered, and results are obtained on the resolution of GNCPs under very general assumptions on the data. These theoretical results show that local methods for box-constrained optimization applied to the associated problem are efficient tools for solving the GNCP. Numerical experiments are presented that encourage the use of this approach.
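The GNCP itself is stated below in one standard form (assumed here; the representation of the polyhedral cone and of the box-constrained reformulation follows common usage rather than the paper's exact notation).

```latex
% Given F, G : R^n -> R^m and a closed convex cone C (polyhedral in the setting above):
\[
  \text{find } x \ \text{ such that } \ F(x) \in C, \qquad G(x) \in C^{*}, \qquad
  F(x)^{\mathsf T} G(x) = 0,
\]
% where C^* = { w : w^T v >= 0 for all v in C } is the dual cone.  When C is polyhedral,
% membership in C and C^* reduces to finitely many linear inequalities, which is what makes a
% projection-free, box-constrained minimization reformulation of the problem possible.
```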

Relevance: 100.00%

Abstract:

This dissertation examines the impacts of energy and climate policies on the energy and forest sectors, focusing on the case of Finland. The thesis consists of an introductory article and four separate studies. The dissertation was motivated by climate concerns and the increasing demand for renewable energy. In particular, the renewable energy consumption and greenhouse gas emission reduction targets of the European Union were driving this work. In Finland, both the forest and energy sectors play key roles in achieving these targets. In fact, the separation between the forest and energy sectors is diminishing, as the energy sector uses increasing amounts of wood in energy production and the forest sector becomes an increasingly important energy producer.

The objective of this dissertation is to identify and measure the impacts of climate and energy policies on the forest and energy sectors. In climate policy, the focus is on emissions trading; in energy policy, the dissertation focuses on the promotion of renewable forest-based energy use. The dissertation relies on empirical numerical models that are based on microeconomic theory. Numerical partial equilibrium mixed complementarity problem models were constructed to study the markets under scrutiny. The separate studies focus on co-firing of wood biomass and fossil fuels, liquid biofuel production in the pulp and paper industry, and the impacts of climate policy on the pulp and paper sector.

The dissertation shows that policies promoting wood-based energy may have unexpected negative impacts. When a feed-in tariff is imposed together with emissions trading, the production of renewable electricity in some plants might decrease as the emissions price increases. The dissertation also shows that in liquid biofuel production, an investment subsidy may cause high direct policy costs and other negative impacts compared with other policy instruments. The results of the dissertation also indicate that, from the climate mitigation perspective, perfect competition is the favored wood market competition structure, at least if the emissions trading system is not global.

In conclusion, this dissertation suggests that when promoting the use of wood biomass in energy production, the favored policy instruments are subsidies that directly promote renewable energy production (i.e. a production subsidy, renewables subsidy or feed-in premium). The policy instrument should also be designed to depend on the emissions price or on the price of the substitute. In addition, this dissertation shows that when planning policies to promote wood-based renewable energy, the goals of the policy scheme should be clear before decisions are made on the choice of policy instruments.
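As background for the partial equilibrium mixed complementarity models mentioned above, the block below sketches the textbook way a market equilibrium condition is written in complementarity form; it is a generic illustration, not the dissertation's actual model system.

```latex
% Market clearing as a complementarity condition (generic, illustrative):
\[
  0 \;\le\; p \;\perp\; S(p) - D(p) \;\ge\; 0 ,
\]
% i.e. the price p and the excess supply S(p) - D(p) are both nonnegative and at least one of
% them is zero: a positive price implies the market clears exactly.  Stacking such conditions
% for producers, consumers and policy constraints (for instance an emissions cap paired with
% its permit price) yields the mixed complementarity systems solved in the separate studies.
```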

Relevance: 100.00%

Abstract:

In this paper, three metaheuristics are proposed for solving a class of job shop, open shop, and mixed shop scheduling problems. We evaluate the performance of the proposed algorithms by means of a set of Lawrence's benchmark instances for the job shop problem, a set of randomly generated instances for the open shop problem, and combined job shop and open shop test data for the mixed shop problem. The computational results show that the proposed algorithms perform extremely well on all three types of shop scheduling problems. The results also reveal that the mixed shop problem is relatively easier to solve than the job shop problem, because the scheduling procedure becomes more flexible with the inclusion of more open shop jobs in the mixed shop.

Relevance: 90.00%

Abstract:

This paper addresses the challenges of flood mapping using multispectral images. Quantitative flood mapping is critical for flood damage assessment and management. Remote sensing images obtained from various satellite or airborne sensors provide valuable data for this application, from which information on the extent of flooding can be extracted. However, the great challenge in interpreting these data is to achieve more reliable flood extent mapping, including both the fully inundated areas and the 'wet' areas where trees and houses are partly covered by water. This is a typical combined pure pixel and mixed pixel problem. In this paper, a recently developed extended Support Vector Machines method for spectral unmixing has been applied to generate an integrated map showing both pure pixels (fully inundated areas) and mixed pixels (trees and houses partly covered by water). The outputs were compared with the conventional mean-based linear spectral mixture model, and better performance was demonstrated on a subset of Landsat ETM+ data recorded at the Daly River Basin, NT, Australia, on 3 March 2008, after a flood event.
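The baseline referred to above is the standard linear spectral mixture model, recalled here in its usual form (the notation is generic, not taken from the paper).

```latex
% Linear spectral mixture model for one pixel:
\[
  \mathbf{y} \;=\; \sum_{j=1}^{m} f_j\, \mathbf{e}_j + \boldsymbol{\varepsilon},
  \qquad f_j \ge 0, \qquad \sum_{j=1}^{m} f_j = 1 ,
\]
% where y is the observed pixel spectrum, e_j are the endmember spectra (e.g. open water,
% vegetation, built-up surfaces), f_j are the sub-pixel cover fractions and \varepsilon is a
% residual term.  The estimated water fraction per pixel is what separates fully inundated
% areas from the partly wet ones.
```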

Relevance: 90.00%

Abstract:

The transmission network planning problem is a non-linear integer mixed programming problem (NLIMP). Most of the algorithms used to solve this problem rely on a linear programming (LP) subroutine to solve the LPs arising from the planning algorithm, and the resolution of these LPs sometimes represents a major computational effort. A particular feature of these LPs is that, at the optimal solution, only a few of the inequality constraints are binding. This work transforms the LP into an equivalent problem with only one equality constraint (the power flow equation) and many inequality constraints, and uses a dual simplex algorithm and a relaxation strategy to solve the LPs. The optimisation process starts with only the equality constraint and, at each step, the most violated inequality constraint is added. The logic is similar to one proposed for electric systems operation planning. The results show a higher performance of the algorithm when compared with primal simplex methods.
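A minimal sketch of the relaxation loop described above is given below; it uses scipy's linprog (HiGHS) instead of a dual simplex code and random stand-in data instead of the real planning LPs, so it illustrates the strategy only, not the paper's implementation.

```python
# Start from the single equality constraint and repeatedly add the most violated inequality.
import numpy as np
from scipy.optimize import linprog

rng = np.random.default_rng(0)
n, m = 10, 200                                   # variables, candidate inequality constraints
c = rng.random(n)                                # objective coefficients
A_eq = np.ones((1, n)); b_eq = [5.0]             # stand-in for the power flow equality
A_ub = rng.normal(size=(m, n))
x_feas = np.full(n, 0.5)                         # known feasible point (sums to 5.0)
b_ub = A_ub @ x_feas + rng.random(m)             # so the full problem stays feasible

active = []                                      # inequalities currently in the model
while True:
    res = linprog(c,
                  A_ub=A_ub[active] if active else None,
                  b_ub=b_ub[active] if active else None,
                  A_eq=A_eq, b_eq=b_eq,
                  bounds=[(-5.0, 5.0)] * n, method="highs")
    viol = A_ub @ res.x - b_ub                   # positive entries are violated constraints
    worst = int(np.argmax(viol))
    if viol[worst] <= 1e-6:                      # every inequality satisfied: stop
        break
    active.append(worst)                         # add the most violated one and re-solve

print(len(active), "of", m, "inequalities were needed; objective =", res.fun)
```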

Relevance: 90.00%

Abstract:

A bounded-level-set result for a reformulation of the box-constrained variational inequality problem proposed recently by Facchinei, Fischer and Kanzow is proved. An application of this result to the (unbounded) nonlinear complementarity problem is suggested. © 1999 Elsevier Science Ltd. All rights reserved.
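To make the role of such a result explicit (a standard argument stated generically; Psi below denotes a merit function for the reformulation, not the specific formula of Facchinei, Fischer and Kanzow):

```latex
% Bounded level sets of a merit function \Psi over the box [l, u]:
\[
  L(x^{0}) \;=\; \{\, x \in [l, u] \;:\; \Psi(x) \le \Psi(x^{0}) \,\}
  \quad \text{is bounded for every starting point } x^{0}.
\]
% Descent methods then keep all iterates in a compact set, so accumulation points exist;
% combined with a result identifying stationary points with solutions, this yields global
% convergence even when the box itself is unbounded, as in the nonlinear complementarity
% application suggested above.
```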

Relevance: 80.00%

Abstract:

Reading plays an important role in establishing lifelong learning and providing the reader with an avenue to new experiences and a language with which to express their ideas and feelings (Owen 2003; Hamston & Love 2005). In particular, adolescents need a language that allows them to 'play with their identities in a safe and controlled manner to explore who they want to be in this ever changing world' (Koss & Teale 2009, 569). Block (1995) advances that there is a distinct correlation between what we read and how we live in the world, and argues 'if what we read influences our identity in the world, the ways we are able to imagine and live in the world, then there is some responsibility to address these various texts, their readers and possible reading experiences' (Koss & Teale 2009, 569). Within my research I attempt to take on this responsibility by establishing a connection between reluctant adolescent male readers and their reading experiences, and by using their opinions to create a novella that seeks to more fully engage them. Centred within the larger debate about boys and books are two central discussions: why don't boys read, and what should boys read? While a number of reasons why adolescent boys don't read are mentioned in this paper, and it might not be possible to fully account for why many are reluctant readers, it is possible to argue that specific forms of literature addressing themes and topics relevant to the age group might appeal to reluctant readers. The conceptual framework for this research was structured using a mixed-method approach consisting of four phases. In positioning my research for determining literature that reluctant readers may want to read, I draw on a variety of material which tends to support the longevity of S.E. Hinton's (1967) argument that 'teenagers today, want to read about teenagers today' (cited in Smith & Wilhelm 2002, 6). My practice-based research was conducted within a high school in Brisbane, Australia. Six participants were selected and asked to read three recently published Australian Young Adult novels, and their opinions on these case studies were collected via semi-structured interviews. Grounded Theory (Charmaz 2003; Charmaz 2006; Glaser & Strauss 2011) informed the design of the questions and the process of concurrent interviews and analysis of opinion. This analysis led to the construction of my theory: adolescent male reluctant readers want to read about female relationships and family conflict within a story that consists of an adventure that, although unlikely to happen, could happen. From this study there are two main contributions, which have theoretical and practical implications for stakeholders with a vested interest in the discussion regarding boys and books. First, this study, through its research methodology, presents key findings indicating that reluctant readers are interested in realistic texts addressing themes that will help with the construction, and understanding, of their own lives. Second, the grounded theory derived from these findings is applied to my own praxis, and my creative artefact (Duende) is included with this exegesis as a text intended to create a connection between engaging texts and adolescent male reluctant readers.

Relevance: 80.00%

Abstract:

The most difficult operation in flood inundation mapping using optical flood images is to separate fully inundated areas from the 'wet' areas where trees and houses are partly covered by water. This can be regarded as a typical case of the mixed pixel problem. A number of automatic information extraction image classification algorithms have been developed over the years for flood mapping using optical remote sensing images. Most classification algorithms assign each pixel to the single class label with the greatest likelihood. However, these hard classification methods often fail to generate reliable flood inundation maps because of the presence of mixed pixels in the images. To address the mixed pixel problem, advanced image processing techniques are adopted; linear spectral unmixing is one of the most popular soft classification techniques used for mixed pixel analysis. The good performance of linear spectral unmixing depends on two important issues: the method of selecting endmembers and the method of modelling the endmembers for unmixing. This paper presents an improvement in the adaptive selection of the endmember subset for each pixel in spectral unmixing for reliable flood mapping. Using a fixed set of endmembers to unmix all pixels in an entire image might overestimate the endmember spectra residing in a mixed pixel and hence reduce the performance of spectral unmixing. In contrast, applying an adaptively estimated subset of endmembers to each pixel can decrease the residual error in the unmixing results and provide reliable output. This paper also shows that the proposed method improves the accuracy of conventional linear unmixing methods and is easy to apply. Three different linear spectral unmixing methods were applied to test the improvement in unmixing results. Experiments were conducted on three different sets of Landsat-5 TM images of three different flood events in Australia to examine the method under different flooding conditions, and satisfactory flood mapping outcomes were achieved.
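A minimal sketch of the constrained linear unmixing step for a single pixel is given below; the endmember spectra are made up, and the adaptive per-pixel endmember selection proposed in the paper is not reproduced, so this only illustrates the generic unmixing model.

```python
# Fully constrained unmixing of one pixel: minimize ||E f - y||^2 with f >= 0, sum(f) = 1.
import numpy as np
from scipy.optimize import minimize

E = np.array([[0.10, 0.45, 0.30],     # bands x endmembers (e.g. water / vegetation / soil)
              [0.08, 0.50, 0.35],
              [0.05, 0.40, 0.45],
              [0.03, 0.55, 0.50]])
y = 0.6 * E[:, 0] + 0.4 * E[:, 1]     # synthetic mixed pixel: 60% water, 40% vegetation

def residual(f):
    r = E @ f - y
    return float(r @ r)

k = E.shape[1]
res = minimize(residual, x0=np.full(k, 1.0 / k), method="SLSQP",
               bounds=[(0.0, 1.0)] * k,
               constraints=[{"type": "eq", "fun": lambda f: np.sum(f) - 1.0}])
print(np.round(res.x, 3))             # estimated fractions, close to [0.6, 0.4, 0.0]
```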

Relevance: 80.00%

Abstract:

Land cover (LC) refers to what is actually present on the ground and provides insights into the underlying solutions for many issues, from water pollution to sustainable economic development. One of the greatest challenges of modeling LC changes using remotely sensed (RS) data is the scale-resolution mismatch: the spatial resolution is coarser than the detail required, and this sub-pixel level heterogeneity is important but not readily knowable. Consequently, many pixels consist of a mixture of multiple classes. The solution to the mixed pixel problem typically centers on soft classification techniques that are used to estimate the proportion of a certain class within each pixel. However, the spatial distribution of these class components within the pixel remains unknown. This study investigates Orthogonal Subspace Projection, an unmixing technique, and uses a pixel-swapping algorithm to predict the spatial distribution of LC at sub-pixel resolution. Both algorithms are applied to several simulated and actual satellite images for validation. The accuracy on the simulated images is ~100%, while IRS LISS-III and MODIS data show accuracies of 76.6% and 73.02%, respectively. This demonstrates the relevance of these techniques for applications such as urban/non-urban and forest/non-forest classification studies.
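The pixel-swapping step is commonly formulated as below; the window size, distance weighting and stopping rule used in this study may differ, so the formulas are an assumed, generic version.

```latex
% Attractiveness of sub-pixel i given its k neighbouring sub-pixels:
\[
  A_i \;=\; \sum_{j=1}^{k} \lambda_{ij}\, z(x_j), \qquad
  \lambda_{ij} \;=\; \exp\!\left(-\frac{d_{ij}}{a}\right),
\]
% where z(x_j) in {0,1} indicates whether neighbour j currently carries the class of interest,
% d_{ij} is the distance between sub-pixel centres and a is a range parameter.  Within each
% coarse pixel, the least attractive sub-pixel labelled 1 and the most attractive sub-pixel
% labelled 0 are swapped when this increases spatial clustering, so the class proportions
% delivered by the unmixing step are preserved while their spatial arrangement is refined.
```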

Relevance: 80.00%

Abstract:

Rapid urbanisation in India has posed serious challenges to decision makers in regional planning, involving a plethora of issues including the provision of basic amenities (electricity, water, sanitation, transport, etc.). Urban planning entails an understanding of landscape and urban dynamics together with their causal factors. Identifying, delineating and mapping landscapes on a temporal scale provide an opportunity to monitor changes, which is important for natural resource management and sustainable planning activities. Multi-source, multi-sensor, multi-temporal, multi-frequency or multi-polarization remote sensing data, with efficient classification algorithms and pattern recognition techniques, aid in capturing these dynamics. This paper analyses the landscape dynamics of Greater Bangalore by: (i) direct characterisation of the impervious surface, (ii) computation of forest fragmentation indices and (iii) modelling to quantify and categorise urban changes. Linear unmixing is used to solve the mixed pixel problem of coarse resolution super spectral MODIS data for impervious surface characterisation. Fragmentation indices were used to classify forests as interior, perforated, edge, transitional, patch and undetermined. Based on this, an urban growth model was developed to determine the type of urban growth: infill, expansion and outlying growth. This helped in visualising urban growth poles and the consequences of earlier policy decisions, which can help in evolving strategies for effective land use policies.
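The forest fragmentation classification mentioned above is sketched below using the widely used Pf/Pff scheme; the thresholds follow the common Riitters et al. convention and are assumptions here, since the abstract does not give the paper's window size or cut-off values.

```python
# Classify a moving-window measurement into the fragmentation categories listed above.
#   pf  : proportion of forest pixels in the window
#   pff : proportion of forest-forest adjacencies among adjacencies involving forest
# Thresholds follow the commonly used Riitters et al. scheme (an assumption here).
def classify_fragmentation(pf: float, pff: float) -> str:
    if pf == 1.0:
        return "interior"
    if pf < 0.4:
        return "patch"
    if pf < 0.6:
        return "transitional"
    # pf >= 0.6: compare forest amount with forest connectivity
    if pf > pff:
        return "perforated"
    if pf < pff:
        return "edge"
    return "undetermined"

print(classify_fragmentation(0.75, 0.60))   # -> perforated
print(classify_fragmentation(0.65, 0.80))   # -> edge
```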