32 results for Particle-hole asymmetry
at Instituto Politécnico do Porto, Portugal
Abstract:
Within a country-size asymmetric monetary union, idiosyncratic shocks and national fiscal stabilization policies cause asymmetric cross-border effects. These effects are a source of strategic interactions between noncoordinated fiscal and monetary policies: on the one hand, due to the larger externalities they impose on the union, large countries face fewer incentives to pursue free-riding fiscal policies; on the other hand, a larger strategic position vis-à-vis the central bank encourages the use of fiscal policy to deliberately influence monetary policy. Additionally, the existence of non-distortionary government financing may also shape policy interactions. As a result, optimal policy regimes may diverge not only across the union members, but also between the latter and the monetary union. In a two-country micro-founded New-Keynesian model for a monetary union, we consider two fiscal policy scenarios: (i) lump-sum taxes are raised to fully finance the government budget and (ii) lump-sum taxes do not ensure balanced budgets in each period; therefore, fiscal and monetary policies are expected to impinge on debt sustainability. For several degrees of country-size asymmetry, we compute optimal discretionary and dynamic non-cooperative policy games and compare their stabilization performance using a union-wide welfare measure. We also assess whether these outcomes could be improved, for the monetary union, through institutional policy arrangements. We find that, in the presence of government indebtedness, monetary policy optimally deviates from macroeconomic to debt stabilization. We also find that policy cooperation is always welfare increasing for the monetary union as a whole; however, indebted large countries may strongly oppose this arrangement in favour of fiscal leadership. In this case, delegation of monetary policy to a conservative central bank proves fruitful in improving the union's welfare.
Abstract:
In the wake of the recent financial and economic crisis, the accumulation of public debt is expected to considerably hamper business-cycle stabilization by enlarging the budgetary consequences of shocks. This paper analyses how the average level of public debt in a monetary union shapes optimal discretionary fiscal and monetary stabilization policies and affects stabilization welfare. We use a two-country micro-founded New-Keynesian model, where a benevolent central bank and the fiscal authorities play discretionary policy games under different union-average debt-constrained scenarios. We find that high debt levels shift the monetary policy assignment from inflation to debt stabilization, making cooperation welfare superior to non-cooperation. Moreover, when average debt is too high, welfare moves directly (inversely) with debt-to-output ratios for the union and the large country (small country) under cooperation. However, under non-cooperation, higher average debt levels benefit only the large country.
Abstract:
This paper addresses the problem of energy resources management using modern metaheuristic approaches, namely Particle Swarm Optimization (PSO), New Particle Swarm Optimization (NPSO) and Evolutionary Particle Swarm Optimization (EPSO). The problem addressed in this paper is intended for use by aggregators operating in a smart grid context, dealing with Distributed Generation (DG) and gridable vehicles intelligently managed on a multi-period basis according to their users' profiles and requirements. The aggregator can also purchase additional energy from external suppliers. The paper includes a case study considering a 30 kV distribution network with one substation, 180 buses and 90 load points. The distribution network in the case study features intense penetration of DG, including 116 units from several technologies, and one external supplier. A scenario of 6000 EVs for the given network is simulated over 24 periods, corresponding to one day. The results of applying the PSO approaches to this case study are discussed in detail in the paper.
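As a rough illustration of the kind of update rule shared by the PSO variants mentioned above, the sketch below implements a plain, generic PSO minimizer; all function names, parameter values and the toy objective are assumptions for illustration, not the formulation or settings used in the paper.

```python
import numpy as np

def pso_minimize(cost, dim, n_particles=30, iters=200, bounds=(-10.0, 10.0),
                 w=0.7, c1=1.5, c2=1.5, seed=0):
    """Plain PSO: velocity/position updates driven by personal and global bests."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    x = rng.uniform(lo, hi, (n_particles, dim))      # particle positions
    v = np.zeros((n_particles, dim))                 # particle velocities
    pbest = x.copy()                                 # personal best positions
    pbest_cost = np.array([cost(p) for p in x])
    gbest = pbest[pbest_cost.argmin()].copy()        # global best position
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        v = w * v + c1 * r1 * (pbest - x) + c2 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)                   # keep particles inside bounds
        costs = np.array([cost(p) for p in x])
        better = costs < pbest_cost
        pbest[better], pbest_cost[better] = x[better], costs[better]
        gbest = pbest[pbest_cost.argmin()].copy()
    return gbest, float(pbest_cost.min())

# Toy usage: a sphere function standing in for an aggregated energy-cost objective
best, best_cost = pso_minimize(lambda p: float(np.sum(p ** 2)), dim=5)
```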
Abstract:
This paper proposes a particle swarm optimization (PSO) approach to support electricity producers in multiperiod optimal contract allocation. The producer's risk preference is stated by a utility function (U) expressing the tradeoff between the expectation and the variance of the return. Variance estimation and expected return are based on a forecasted scenario interval determined by a price range forecasting model developed by the authors. A confidence level is associated with each forecasted scenario interval. The proposed model makes use of contracts with physical (spot and forward) and financial (options) settlement. PSO performance was evaluated by comparing it with a genetic algorithm-based approach. This model can be used by producers in deregulated electricity markets but can easily be adapted to load serving entities and retailers. Moreover, it can easily be adapted to the use of other types of contracts.
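The utility function U described above trades off the expectation and the variance of the return; a common mean-variance form, given here only as an assumed illustration of that trade-off (the paper's exact specification may differ), is

U(R) = \mathbb{E}[R] - \lambda\,\mathrm{Var}[R], \qquad \lambda \ge 0,

where a larger risk-aversion parameter \lambda penalizes return variance more heavily.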
Abstract:
Distributed Energy Resources (DER) scheduling in smart grids presents a new challenge to system operators. The increase of new resources, such as storage systems and demand response programs, results in additional computational effort for optimization problems. On the other hand, since natural resources such as wind and sun can only be forecast accurately a short time ahead, short-term scheduling is especially relevant, requiring very good performance on large-dimension problems. Traditional techniques such as Mixed-Integer Non-Linear Programming (MINLP) do not cope well with large-scale problems; this type of problem can be appropriately addressed by metaheuristic approaches. This paper proposes a new methodology called Signaled Particle Swarm Optimization (SiPSO) to address the energy resources management problem in the scope of smart grids, with intensive use of DER. The proposed methodology's performance is illustrated by a case study with 99 distributed generators, 208 loads, and 27 storage units. The results are compared with those obtained with other methodologies, namely MINLP, Genetic Algorithm, original Particle Swarm Optimization (PSO), Evolutionary PSO, and New PSO. SiPSO's performance is superior to that of the other tested PSO variants, demonstrating its adequacy for solving large-dimension problems that require a decision in a short period of time.
Abstract:
Short-term risk management is highly dependent on previously established long-term contractual decisions, on the agent's risk aversion factor, and on short-term price forecast accuracy. To address this problem, this paper provides a different approach to short-term risk management in electricity markets. Based on long-term contractual decisions and making use of a price range forecast method developed by the authors, the short-term risk management tool presented here aims to find the optimal spot-market strategies that a producer should adopt for a specific day as a function of his risk aversion factor, with the objective of maximizing profits while hedging against market price volatility. Due to the complexity of the optimization problem, the authors make use of Particle Swarm Optimization (PSO) to find the optimal solution. Results from realistic data, namely from the OMEL electricity market, are presented and discussed in detail.
Abstract:
The concept of demand response has a growing importance in the context of future power systems. Demand response can be seen as a resource like distributed generation, storage, electric vehicles, etc. All these resources require an infrastructure able to give players the means to operate and use them in an efficient way. This infrastructure implements the smart grid concept in practice, and should accommodate a large number of diverse types of players in the context of a competitive business environment. In this paper, demand response is optimally scheduled jointly with other resources, such as distributed generation units and the energy provided by the electricity market, minimizing the operation costs from the point of view of a virtual power player who manages these resources and supplies the aggregated consumers. The optimal schedule is obtained using two approaches based on particle swarm optimization (with and without mutation), which are compared with a deterministic approach used as a reference methodology. A case study with two scenarios implemented in DemSi, a demand response simulator developed by the authors, demonstrates the advantages of the proposed particle swarm approaches.
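To give an idea of what a mutation step in PSO can look like, here is a minimal, generic sketch; the operator, its name and the mutation rate are assumptions for illustration, not the scheme implemented in DemSi or in the paper.

```python
import numpy as np

def mutate_positions(x, bounds, rate=0.05, rng=None):
    """Generic PSO mutation: with a small probability, re-draw a position
    component uniformly within its bounds to preserve swarm diversity."""
    rng = rng if rng is not None else np.random.default_rng()
    lo, hi = bounds
    mask = rng.random(x.shape) < rate        # components selected for mutation
    x = x.copy()
    x[mask] = rng.uniform(lo, hi, int(mask.sum()))
    return x

# Toy usage: mutate a swarm of 20 particles in a 10-dimensional search space
swarm = np.random.default_rng(0).uniform(0.0, 1.0, (20, 10))
swarm = mutate_positions(swarm, bounds=(0.0, 1.0))
```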
Abstract:
The objective of this thesis is to design a fluidized bed dryer for cereal drying, namely the drying of wheat seeds. The hydrodynamic conditions were determined first (fluidization velocity, TDH, minimum slugging conditions, bed expansion, distributor design and pressure drop). With the hydrodynamic conditions defined, it was possible to estimate the physical dimensions of the dryer. At this point, studies were carried out on the drying kinetics and on the drying itself. The pneumatic transport of the seeds was also studied, and the velocities required for pneumatic transport and the corresponding pressure drops were determined. Finally, a cost analysis was performed to establish the cost of this drying system. The drying study was carried out for an operating temperature of 50 °C, with the caveat that, at the limit, 60 °C could be used. The operating velocity is 2.43 m/s and the fixed bed height is 0.4 m, which expands during fluidization to 0.79 m. The TDH obtained was 1.97 m, which, added to the expanded bed height, gives a total column height of 2.76 m. The fixed bed height yields a column diameter of 0.52 m. The expanded bed height is lower than the minimum slugging height (1.20 m); however, the operating velocity is higher than the minimum slugging velocity (1.13 m/s). Since only one of the minimum conditions is met, slugging may occur. Finally, it was necessary to design the distributor, which, with an orifice diameter of 3 mm, smaller than the particle diameter (3.48 mm), distributes the drying fluid in the column through its 3061 orifices. The drying study began with the determination of the drying time. In addition to the two temperatures mentioned above, two initial moisture contents for the cereals were also considered (21.33% and 18.91%). Higher temperatures lead to shorter drying times and, likewise, lower initial moisture contents lead to shorter times. For a temperature of 50 °C, the drying times were 2.8 hours for 21.33% moisture and 2.7 hours for 18.91% moisture. Three times of year were also considered for the intake of the drying air: Summer and Winter, representing the extremes, and mid-season. For these three cases, the specific humidity of the air shows no significant change between the dryer inlet and the outlet stream, and the outlet temperature differs little from the inlet temperature. This deviation of about 1% for both humidities and temperatures is explained by the absence of external moisture on the seeds and by the small amount of internal moisture. These 1% deviations therefore allow a recycle ratio of around 100% without significantly changing the drying behaviour. Using 100% recycling allows energy savings of about 98% in Winter and mid-season and about 93% in Summer. If no recycling were carried out, it would be necessary to supply the air stream with about 18.81 kW to raise its temperature from 20 °C to 50 °C (mid-season), about 24.67 kW to raise it from 10 °C to 50 °C (Winter), and around 8.90 kW to raise it from 35 °C to 50 °C (Summer).
For pneumatic transport there are two lines, one horizontal and one vertical, so the particle velocity had to be estimated for both cases. In the vertical line the particle velocity is about 25.03 m/s, and about 35.95 m/s in the horizontal line. The lower value in the vertical line is due to the need to overcome gravity in that section. In both circuits the fluid velocity is about 47.17 m/s. Inside the column, the fluid velocity is 10.90 m/s and the particle velocity is 1.04 m/s. The total pressure drop in the system is about 2408 Pa. The cost analysis of the drying system indicated a total cost (manufacture plus transport) of about €153,035. The system requires electricity to operate, at an annual cost of about €7,951.4. Although this drying system allows a recycle ratio of around 100% and can be adapted to different types of cereal, and even to other materials, provided they can be fluidized, its cost makes the investment unattractive, especially considering that this is a pilot-scale installation with a capacity of 45 kg.
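The heating duties quoted above (18.81 kW, 24.67 kW and 8.90 kW) presumably follow from the standard sensible-heating relation, recalled here for reference:

\dot{Q} = \dot{m}\, c_{p}\, \Delta T,

where \dot{m} is the drying-air mass flow rate, c_{p} its specific heat, and \Delta T the temperature rise from the intake temperature to the 50 °C operating temperature.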
Abstract:
Multi-objective particle swarm optimization (MOPSO) is a search algorithm based on social behavior. Most of the existing multi-objective particle swarm optimization schemes are based on Pareto optimality and aim to obtain a representative non-dominated Pareto front for a given problem. Several approaches have been proposed to study the convergence and performance of the algorithm, particularly by assessing the final results. In the present paper, a different approach is proposed: Shannon entropy is used to analyze the MOPSO dynamics along the algorithm execution. The results indicate that Shannon entropy can be used as an indicator of diversity and convergence for MOPSO problems.
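A minimal sketch of how Shannon entropy can serve as a diversity indicator for a swarm, computed here over a uniform grid in objective space; the binning scheme and all names are assumptions for illustration, not the measure defined in the paper.

```python
import numpy as np

def swarm_entropy(points, bins=10):
    """Shannon entropy (in bits) of a swarm's distribution over a uniform grid."""
    hist, _ = np.histogramdd(np.asarray(points), bins=bins)
    p = hist.ravel() / hist.sum()            # cell occupation probabilities
    p = p[p > 0]                             # ignore empty cells
    return float(-np.sum(p * np.log2(p)))

# Toy usage: 100 random two-objective points; higher entropy = more spread-out swarm
rng = np.random.default_rng(1)
print(swarm_entropy(rng.random((100, 2))))
```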
Abstract:
Competitive electricity markets have arisen as a result of power-sector restructuring and power-system deregulation. The players participating in competitive electricity markets must define strategies and make decisions using all the available information and business opportunities.
Abstract:
This paper presents a modified Particle Swarm Optimization (PSO) methodology to solve the problem of energy resources management with high penetration of distributed generation and Electric Vehicles (EVs) with gridable capability (V2G). The objective of the day-ahead scheduling problem in this work is to minimize operation costs, namely energy costs, regarding the management of these resources in the smart grid context. The modifications applied to the PSO aim to improve its adequacy for solving this problem. The proposed Application Specific Modified Particle Swarm Optimization (ASMPSO) includes an intelligent mechanism to adjust velocity limits during the search process, as well as self-parameterization of the PSO parameters, making it more user-independent. It presents better robustness and convergence characteristics than the tested PSO variants, as well as better constraint handling. This enables its use for addressing real-world large-scale problems in much shorter times than deterministic methods, providing system operators with adequate decision support and achieving efficient resource scheduling, even when a significant number of alternative scenarios must be considered. The paper includes two realistic case studies with different penetrations of gridable vehicles (1000 and 2000). The proposed methodology is about 2600 times faster than the Mixed-Integer Non-Linear Programming (MINLP) reference technique, reducing the time required from 25 h to 36 s for the scenario with 2000 vehicles, with a difference of about one percent in the objective function cost value.
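As a hypothetical illustration of adjusting velocity limits during the search (the actual ASMPSO mechanism is not reproduced here), a simple linearly decaying limit could look like this:

```python
import numpy as np

def decayed_velocity_limit(v_max_init, iteration, max_iters, final_frac=0.1):
    """Linearly shrink the velocity limit over the run, narrowing exploration
    as the swarm converges (illustrative schedule, not the ASMPSO rule)."""
    frac = 1.0 - (1.0 - final_frac) * iteration / max_iters
    return v_max_init * frac

def clamp(v, v_max):
    """Clamp every velocity component to the current limit."""
    return np.clip(v, -v_max, v_max)

# Toy usage: limit at the midpoint of a 200-iteration run
print(decayed_velocity_limit(4.0, iteration=100, max_iters=200))
```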
Abstract:
This paper studies a discrete dynamical system of particles that evolve by interacting with one another. The computational model is an abstraction of the natural world, and real systems can range from the huge cosmological scale down to the scale of a biological cell, or even molecules. Different conditions for the system's evolution are tested. The emerging patterns are analysed by means of fractal dimension and entropy measures. It is observed that the population of particles evolves towards geometrical objects with a fractal nature. Moreover, the time signature of the entropy can be interpreted in the light of complex dynamical systems.
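A minimal sketch of one common way to estimate the fractal dimension of an emerging point pattern, the box-counting method; the names, scales and test pattern are illustrative assumptions, and the paper's own fractal and entropy measures may differ.

```python
import numpy as np

def box_counting_dimension(points, scales=(1/2, 1/4, 1/8, 1/16, 1/32)):
    """Estimate the box-counting dimension of a 2-D point set lying in [0, 1]^2."""
    pts = np.asarray(points, float)
    counts = [len(np.unique(np.floor(pts / eps), axis=0)) for eps in scales]
    # slope of log N(eps) versus log(1/eps) estimates the dimension
    slope, _ = np.polyfit(np.log(1.0 / np.asarray(scales)), np.log(counts), 1)
    return float(slope)

# Toy usage: points on a straight line should give a dimension close to 1
t = np.linspace(0.0, 1.0, 2000)
print(box_counting_dimension(np.column_stack([t, t])))
```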
Abstract:
This manuscript analyses the data generated by a Zero Length Column (ZLC) diffusion experimental set-up for 1,3 di-isopropyl benzene in a 100% alumina matrix with variable particle size. The time evolution of the phenomena resembles that of fractional-order systems, namely a fast initial transient followed by long, slow tails. The experimental measurements are best fitted by the Harris model, revealing power-law behavior.
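A minimal sketch of extracting a power-law tail exponent from desorption-curve data via a log-log slope; this is a generic illustration, not the Harris-model fit used in the manuscript, and all names and the synthetic data are assumptions.

```python
import numpy as np

def tail_power_law_exponent(t, c, tail_fraction=0.5):
    """Estimate alpha for a tail c(t) ~ t**(-alpha) from the late-time log-log slope."""
    t, c = np.asarray(t, float), np.asarray(c, float)
    start = int(len(t) * (1.0 - tail_fraction))     # keep only the late-time tail
    slope, _ = np.polyfit(np.log(t[start:]), np.log(c[start:]), 1)
    return float(-slope)

# Toy usage: synthetic tail with alpha = 1.5
t = np.linspace(1.0, 100.0, 500)
print(tail_power_law_exponent(t, t ** -1.5))
```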
Abstract:
Collective behaviours can be observed in both natural and man-made systems composed of a large number of elemental subsystems. Typically, each elemental subsystem has its own dynamics but, whenever interaction between individuals occurs, the individual behaviours tend to relax and collective behaviours emerge. In this paper, the collective behaviour of a large-scale system composed of several coupled elemental particles is analysed. The dynamics of the particles are governed by the same type of equations but with different parameter values and initial conditions. Coupling between particles is based on statistical feedback, meaning that each particle is affected by the average behaviour of its neighbours. It is shown that the global system may exhibit several types of collective behaviour: partial synchronisation, characterised by the existence of several clusters of synchronised subsystems, and global synchronisation, where all the elemental particles synchronise completely.
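A minimal sketch of statistical (mean-field) feedback coupling, using an ensemble of logistic maps as stand-in elemental dynamics; the map, parameter ranges and coupling strength are assumptions for illustration, not the equations studied in the paper.

```python
import numpy as np

def mean_field_coupled_maps(n=100, steps=500, coupling=0.4, seed=0):
    """Ensemble of logistic maps nudged towards the population average each step."""
    rng = np.random.default_rng(seed)
    r = rng.uniform(3.8, 4.0, n)     # slightly different parameter per particle
    x = rng.random(n)                # different initial conditions
    for _ in range(steps):
        f = r * x * (1.0 - x)                            # individual dynamics
        x = (1.0 - coupling) * f + coupling * f.mean()   # statistical feedback
    return x

# Dispersion across the ensemble after the transient; lower values indicate
# stronger synchronisation between particles
print(mean_field_coupled_maps().std())
```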