898 results for branch and price


Relevância:

100.00% 100.00%

Publicador:

Resumo:

By means of a meta-analysis, this article sets out to estimate average values for the income and price elasticities of gasoline demand and to analyse the reasons for the variations in the elasticities reported in the literature. The findings show that there is publication bias, that the volatility of elasticity estimates is not due to sampling errors alone, and that there are systematic factors explaining these differences. The income and price elasticities of gasoline demand differ between the short and long run and by region, and they also depend on whether the estimation includes the vehicle fleet and the prices of substitute goods, on the data types and on the estimation methods used. The presence of a low price elasticity suggests that a fuel tax will be inadequate to control rising consumption in a context of rapid economic growth.
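As an illustration of what a single price-elasticity estimate involves, the sketch below fits a log-log gasoline demand equation by ordinary least squares on synthetic data; the elasticity values and variable names are hypothetical, not taken from the meta-analysis.

```python
import numpy as np

# Synthetic (hypothetical) data: log demand generated from known elasticities.
rng = np.random.default_rng(0)
log_price = rng.uniform(0.0, 1.0, 200)
log_income = rng.uniform(2.0, 3.0, 200)
true_price_elast, true_income_elast = -0.25, 0.8
log_demand = 2.0 + true_price_elast * log_price + true_income_elast * log_income

# In a log-log specification the OLS slope coefficients are the elasticities.
X = np.column_stack([np.ones_like(log_price), log_price, log_income])
coef, *_ = np.linalg.lstsq(X, log_demand, rcond=None)
price_elasticity, income_elasticity = coef[1], coef[2]
```

On noiseless data the slopes recover the elasticities exactly; with real data they carry sampling error, which is one of the sources of variation the meta-analysis examines.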

Resumo:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Resumo:

In this paper a novel Branch and Bound (B&B) algorithm is presented for transmission expansion planning, a non-convex mixed-integer nonlinear programming (MINLP) problem. Because it is based on defining the options of the separating variables and performs a breadth-first search, we call it a B&BML algorithm. The proposed algorithm is implemented in AMPL, and the open-source Ipopt solver is used to solve the nonlinear programming (NLP) subproblems of all candidates in the B&B tree. Strategies have been developed to address the non-linearity and non-convexity of the search region. The proposed algorithm is applied to the problem of long-term transmission expansion planning modeled as an MINLP problem and has been tested on five commonly used test systems: the Garver 6-bus, IEEE 24-bus, South Brazilian 46-bus, Bolivian 57-bus, and Colombian 93-bus systems. Results show that the proposed methodology not only finds the best known solutions but also yields a large reduction, of between 24% and 77.6% depending on the size of the system, in the number of NLP problems solved.
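The branch and bound mechanics described above (branching on a variable, bounding each candidate, searching in breadth) can be illustrated on a much simpler problem than transmission expansion planning. The sketch below, a hypothetical stand-in rather than the authors' B&BML algorithm, applies breadth-first branch and bound with an LP-style fractional bound to a 0/1 knapsack instance:

```python
from collections import deque

def fractional_bound(level, value, weight, items, cap):
    # Greedy LP-relaxation bound over the remaining items (density-sorted).
    bound, w = value, weight
    for v, wt in items[level:]:
        if w + wt <= cap:
            w += wt
            bound += v
        else:
            bound += v * (cap - w) / wt  # take a fractional piece, then stop
            break
    return bound

def knapsack_bb(values, weights, cap):
    items = sorted(zip(values, weights), key=lambda t: t[0] / t[1], reverse=True)
    best = 0
    queue = deque([(0, 0, 0)])           # (level, value, weight): breadth-first
    while queue:
        level, value, weight = queue.popleft()
        if level == len(items):
            continue
        v, w = items[level]
        if weight + w <= cap:            # branch 1: include the item
            best = max(best, value + v)
            queue.append((level + 1, value + v, weight + w))
        # branch 2: exclude the item, pruned when the bound cannot beat best
        if fractional_bound(level + 1, value, weight, items, cap) > best:
            queue.append((level + 1, value, weight))
    return best

optimum = knapsack_bb([60, 100, 120], [10, 20, 30], 50)  # → 220
```

The bound plays the same role as the NLP subproblems in the B&B tree above: candidates whose relaxation cannot improve on the incumbent are discarded without further branching.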

Resumo:

Coordenação de Aperfeiçoamento de Pessoal de Nível Superior (CAPES)

Resumo:

In this article we propose a branch and cut algorithm with new inequalities specific to the transmission network expansion planning problem. All of the inequalities proposed in this work are valid both for the linear and for the nonlinear models of the problem. Computational tests have shown the efficiency of the proposed method when applied to real Brazilian subsystems and to the Colombian system.
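As a toy illustration of why valid inequalities help in a branch and cut method (it is unrelated to the network planning inequalities proposed in the article, and assumes `scipy` is available), the sketch below shows a cover inequality tightening an LP relaxation:

```python
from scipy.optimize import linprog

# Maximize x1 + x2 + x3 subject to 3*x1 + 4*x2 + 5*x3 <= 6, 0 <= x <= 1.
# Any two items together already exceed the capacity 6, so
# x1 + x2 + x3 <= 1 is a valid inequality for every integer solution.
c = [-1.0, -1.0, -1.0]                      # linprog minimizes, so negate
A_plain = [[3.0, 4.0, 5.0]]
A_cut = [[3.0, 4.0, 5.0], [1.0, 1.0, 1.0]]  # same LP plus the cover cut

lp = linprog(c, A_ub=A_plain, b_ub=[6.0], bounds=(0, 1), method="highs")
lp_cut = linprog(c, A_ub=A_cut, b_ub=[6.0, 1.0], bounds=(0, 1), method="highs")

bound_without_cut = -lp.fun    # 1.75: fractional LP optimum
bound_with_cut = -lp_cut.fun   # 1.0: matches the best integer solution
```

Adding the cut closes the relaxation gap entirely here; in a branch and cut method such inequalities shrink the tree that branching must explore.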

Resumo:

Study I: Real Wage Determination in the Swedish Engineering Industry

This study uses the monopoly union model to examine the determination of real wages, and in particular the effects of active labour market programmes (ALMPs) on real wages, in the engineering industry. Quarterly data for the period 1970:1 to 1996:4 are used in a cointegration framework, utilising Johansen's maximum likelihood procedure. On the basis of the Johansen (trace) test results, vector error correction (VEC) models are created in order to model the determination of real wages in the engineering industry. The estimation results support the presence of a long-run wage-raising effect of rises in labour productivity, in the tax wedge, in the alternative real consumer wage and in real UI benefits. The estimation results also support the presence of a long-run wage-raising effect of positive changes in the participation rates in ALMPs, relief jobs and labour market training. This could be interpreted as meaning that the possibility of participating in an ALMP increases the utility for workers of not being employed in the industry, which in turn could increase real wages in the industry in the long run. Finally, the estimation results show evidence of a long-run wage-reducing effect of positive changes in the unemployment rate.

Study II: Intersectoral Wage Linkages in Sweden

The purpose of this study is to investigate whether the wage-setting in certain sectors of the Swedish economy affects the wage-setting in other sectors. The theoretical background is the Scandinavian model of inflation, which states that the wage-setting in the sectors exposed to international competition affects the wage-setting in the sheltered sectors of the economy. The Johansen maximum likelihood cointegration approach is applied to quarterly data on Swedish sector wages for the period 1980:1–2002:2. Different vector error correction (VEC) models are created, based on assumptions as to which sectors are exposed to international competition and which are not. The adaptability of wages between sectors is then tested by imposing restrictions on the estimated VEC models. Finally, Granger causality tests are performed in the different restricted/unrestricted VEC models to test for sector wage leadership. The empirical results indicate considerable adaptability of wages between manufacturing, construction, the wholesale and retail trade, the central government sector and the municipalities and county councils sector. This is consistent with the assumptions of the Scandinavian model. Further, the empirical results indicate a low level of adaptability of wages between the financial sector and manufacturing, and between the financial sector and the two public sectors. The Granger causality tests provide strong evidence for the presence of intersectoral wage causality, but no evidence of a wage-leading role consistent with the assumptions of the Scandinavian model for any of the sectors.

Study III: Wage and Price Determination in the Private Sector in Sweden

The purpose of this study is to analyse wage and price determination in the private sector in Sweden during the period 1980–2003. The theoretical background is a variant of the "imperfect competition model of inflation", which assumes imperfect competition in the labour and product markets. According to the model, wages and prices are determined as the outcome of a "battle of mark-ups" between trade unions and firms. The Johansen maximum likelihood cointegration approach is applied to quarterly Swedish data on consumer prices, import prices, private-sector nominal wages, private-sector labour productivity and the total unemployment rate for the period 1980:1–2003:3. The chosen cointegration rank of the estimated vector error correction (VEC) model is two. Thus, two cointegration relations are assumed: one for private-sector nominal wage determination and one for consumer price determination. The estimation results indicate that an increase in consumer prices of one per cent lifts private-sector nominal wages by 0.8 per cent, while an increase in private-sector nominal wages of one per cent increases consumer prices by one per cent. An increase of one percentage point in the total unemployment rate reduces private-sector nominal wages by about 4.5 per cent. The long-run effects of private-sector labour productivity and import prices on consumer prices are about –1.2 and 0.3 per cent, respectively. The Rehnberg agreement of 1991–92 and the monetary policy shift in 1993 affected the determination of private-sector nominal wages, private-sector labour productivity, import prices and the total unemployment rate. The "offensive" devaluation of the Swedish krona by 16 per cent in 1982:4, and the start of a floating Swedish krona and the substantial depreciation of the krona at that time, affected the determination of import prices.
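A minimal numerical reading of the estimated long-run wage relation, using only the point estimates quoted in Study III (the linear form is a simplification for illustration, not the study's full VEC specification):

```python
def long_run_wage_change(d_prices_pct, d_unemployment_pp):
    """Approximate long-run change in private-sector nominal wages (per cent),
    using the quoted point estimates: 0.8 per one per cent of consumer prices,
    -4.5 per percentage point of total unemployment."""
    return 0.8 * d_prices_pct - 4.5 * d_unemployment_pp

# A 2% rise in consumer prices alongside a 0.5pp rise in unemployment:
# 0.8 * 2 - 4.5 * 0.5 = 1.6 - 2.25 = -0.65 per cent.
change = long_run_wage_change(2.0, 0.5)
```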

Resumo:

This thesis investigates decomposition and reformulation methods for solving integer linear programming problems. This approach is often very successful computationally, producing high-quality solutions for well-structured combinatorial optimization problems such as vehicle routing, cutting stock, p-median and generalized assignment. However, until now the method has always been tailored to the specific problem under investigation. The principal innovation of this thesis is a new framework able to apply this concept to a generic MIP problem. The new approach is thus capable of auto-decomposition and auto-reformulation of the input problem, applicable as a black-box solution algorithm, and works as a complement and alternative to standard solution techniques. The idea of decomposing and reformulating (usually called Dantzig-Wolfe decomposition, DWD, in the literature) is, given a MIP, to convexify one or more subsets of constraints (slaves) and to work on the partially convexified polyhedron(s) obtained. For a given MIP, several decompositions can be defined, depending on which sets of constraints we want to convexify. In this thesis we mainly reformulate MIPs using two sets of variables: the original variables and the extended variables (representing the exponentially many extreme points). The master constraints consist of the original constraints not included in any slave, plus the convexity constraint(s) and the linking constraints (ensuring that each original variable can be viewed as a linear combination of extreme points of the slaves). The solution procedure consists of iteratively solving the reformulated MIP (master) and checking (pricing) whether a variable with negative reduced cost exists; if so, it is added to the master, which is solved again (column generation); otherwise the procedure stops.
The advantage of using DWD is that the reformulated relaxation gives bounds stronger than the original LP relaxation; in addition, it can be incorporated in a branch and bound scheme (branch and price) in order to solve the problem to optimality. If the computational time for the pricing problem is reasonable, this leads in practice to a substantial speed-up in solution time, especially when the convex hull of the slaves is easy to compute, usually because of its special structure.
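The master/pricing loop described above can be sketched on a tiny cutting-stock instance. This is a generic column-generation illustration, not code from the thesis; it assumes `scipy` is available for the master LP, and the pricing problem is an unbounded knapsack solved by dynamic programming:

```python
import numpy as np
from scipy.optimize import linprog

ROLL = 10                       # stock roll width
widths = [3, 4, 5]              # piece widths
demand = [4, 2, 3]              # pieces required of each width

def solve_master(patterns):
    """LP relaxation of the master: minimize rolls used, covering demand."""
    A = np.array(patterns).T    # rows = piece types, columns = patterns
    res = linprog(c=np.ones(A.shape[1]), A_ub=-A, b_ub=-np.array(demand),
                  bounds=(0, None), method="highs")
    return res.fun, -res.ineqlin.marginals      # objective, dual prices

def price_pattern(duals):
    """Pricing: unbounded knapsack maximizing dual value within one roll."""
    best = [0.0] * (ROLL + 1)
    choice = [-1] * (ROLL + 1)
    for cap in range(1, ROLL + 1):
        for i, w in enumerate(widths):
            if w <= cap and best[cap - w] + duals[i] > best[cap]:
                best[cap] = best[cap - w] + duals[i]
                choice[cap] = i
    pattern, cap = [0] * len(widths), ROLL      # recover the chosen pattern
    while cap > 0 and choice[cap] != -1:
        pattern[choice[cap]] += 1
        cap -= widths[choice[cap]]
    return best[ROLL], pattern

# Start with one single-width pattern per piece type, then generate columns.
patterns = [[ROLL // w if j == i else 0 for j in range(len(widths))]
            for i, w in enumerate(widths)]
while True:
    obj, duals = solve_master(patterns)
    value, pattern = price_pattern(duals)
    if value <= 1.0 + 1e-9:     # no column with negative reduced cost: stop
        break
    patterns.append(pattern)    # column generation step
```

The loop terminates with the strengthened LP bound; embedding it in a branch and bound tree (branching, then re-pricing at each node) gives branch and price.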

Resumo:

Retinal vein occlusion is a leading cause of visual impairment. Experimental models of this condition based on laser photocoagulation of retinal veins have been described and extensively exploited in mammals and larger rodents such as the rat; however, few reports exist on the use of this paradigm in the mouse. The objective of this study was to investigate a model of branch and central retinal vein occlusion in the mouse and to characterize longitudinal retinal morphology alterations in vivo using spectral domain optical coherence tomography. Retinal veins were experimentally occluded using laser photocoagulation after intravenous application of Rose Bengal, a photo-activator dye enhancing thrombus formation. Depending on the number of veins occluded, variable amounts of capillary dropout were seen on fluorescein angiography. Vascular endothelial growth factor levels were markedly elevated early and peaked at day one. Retinal thickness measurements with spectral domain optical coherence tomography showed significant swelling (p<0.001) compared to baseline, followed by gradual thinning that plateaued two weeks after the experimental intervention (p<0.001). Histological findings at day seven correlated with spectral domain optical coherence tomography imaging. The inner layers were predominantly affected by degeneration, with the outer nuclear layer and the photoreceptor outer segments largely preserved. The application of this retinal vein occlusion model in the mouse carries several advantages over its use in other, larger species, such as access to a vast range of genetically modified animals. Retinal changes after experimental retinal vein occlusion in this mouse model can be non-invasively quantified by spectral domain optical coherence tomography and may be used to monitor the effects of potential therapeutic interventions.

Resumo:

A study was carried out to evaluate preferences for two cuts, four countries of origin, two forms of presentation, brand and different prices of beef among supermarket buyers in southern Chile, and to distinguish the existence of different market segments, through a survey of 800 people. Using a fractional factorial design for conjoint analysis, it was determined that, overall, origin was more important (44.5%) than price (20.8%), form of presentation (12.2%), cut (12.0%) and brand (10.5%), with a preference for Chilean and Argentinean striploin, packaged in trays, with no brand, at a medium price. Using cluster analysis, three market segments were distinguished. The largest (52.3%) placed great importance on origin and preferred the highest price. The second (27.5%) also valued origin, with the greatest preference for Argentinean beef, and was the only group that preferred ribeye as the cut. The third (20.5%) placed the greatest importance on price and was the only group that preferred Paraguayan meat. The segments differed in the importance they attached to eating meat for personal well-being. The low importance of packaging and brand indicates poorly developed marketing of this product. In order to position branded beef properly in the Chilean market, communication strategies must be implemented that identify the product with superior quality and position the brand in the consumer's mind.
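Relative importance figures of the kind quoted above are typically derived from the range of each attribute's part-worth utilities in the conjoint model. A minimal sketch with made-up part-worths (illustrative values, not the study's estimates):

```python
# Hypothetical part-worth utilities per attribute level (illustrative only).
part_worths = {
    "origin": [0.9, 0.4, -0.3, -1.0],   # four countries of origin
    "price": [0.5, 0.1, -0.6],          # three price levels
    "presentation": [0.3, -0.3],        # tray vs. other packaging
    "cut": [0.25, -0.25],               # striploin vs. ribeye
    "brand": [0.2, -0.2],               # branded vs. unbranded
}

# Importance of an attribute = its utility range / sum of all ranges.
ranges = {a: max(u) - min(u) for a, u in part_worths.items()}
total = sum(ranges.values())
importance = {a: 100 * r / total for a, r in ranges.items()}
```

An attribute whose levels swing utility the most (here, origin) dominates the importance shares, mirroring the ranking reported in the study.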

Resumo:

We propose a computational methodology, "B-LOG", which offers the potential for an effective implementation of logic programming on a parallel computer. We also propose a weighting scheme to guide the search process through the graph, and we apply the concepts of parallel "branch and bound" algorithms in order to perform a "best-first" search using an information-theoretic bound. The concept of a "session" is used to speed up the search process over a succession of similar queries. Within a session, we strongly modify the bounds in a local database, while the bounds kept in a global database are only weakly modified, to provide a better initial condition for other sessions. We also propose an implementation scheme based on a database machine using "semantic paging", and a "B-LOG processor" based on a scoreboard-driven controller.
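A best-first search of the kind described, with the frontier ordered by cost so far plus a bound, can be sketched as follows; this is a generic illustration (a trivial admissible heuristic stands in for the information-theoretic bound), not the B-LOG scheme itself:

```python
import heapq

def best_first_search(graph, start, goal, bound):
    """Expand the node with the smallest (cost so far + bound) first."""
    frontier = [(bound(start), 0, start)]
    best_cost = {}
    while frontier:
        est, cost, node = heapq.heappop(frontier)
        if node == goal:
            return cost
        if node in best_cost and best_cost[node] <= cost:
            continue                     # already reached more cheaply
        best_cost[node] = cost
        for nbr, weight in graph.get(node, []):
            heapq.heappush(frontier, (cost + weight + bound(nbr),
                                      cost + weight, nbr))
    return None

# Tiny weighted graph; a zero bound degenerates to uniform-cost search.
graph = {"A": [("B", 1), ("C", 4)],
         "B": [("C", 2), ("D", 5)],
         "C": [("D", 1)]}
shortest = best_first_search(graph, "A", "D", lambda n: 0)  # → 4 (A-B-C-D)
```

Tightening the bound prunes more of the frontier earlier, which is exactly the leverage a weighting scheme gives a parallel branch and bound search.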