47 results for Initial solution

at Instituto Politécnico do Porto, Portugal


Relevance:

100.00%

Publisher:

Abstract:

An intensive use of dispersed energy resources is expected for future power systems, including distributed generation, especially based on renewable sources, and electric vehicles. The system operation methods and tools must be adapted to the increased complexity, especially the optimal resource scheduling problem. Therefore, the use of metaheuristics is required to obtain good solutions in a reasonable amount of time. This paper proposes two new heuristics, called naive electric vehicles charge and discharge allocation and generation tournament based on cost, developed to obtain an initial solution to be used in the energy resource scheduling methodology based on simulated annealing previously developed by the authors. The case study considers two scenarios with 1000 and 2000 electric vehicles connected to a distribution network. The proposed heuristics are compared with a deterministic approach and present a very small error in the objective function with a low execution time for the scenario with 2000 vehicles.
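
For intuition, a minimal sketch of what a naive charge allocation heuristic of this kind could look like, assuming a greedy assignment of each vehicle's energy requirement to the cheapest periods (the function name, data layout and prices are illustrative assumptions, not the authors' implementation):

```python
# Illustrative sketch (not the authors' code): greedily allocate each EV's
# required charging energy to the cheapest periods, respecting the charger limit.

def naive_ev_charge_allocation(required_kwh, charger_kw, period_hours, prices):
    """Return a per-period charging plan (kWh) for one electric vehicle.

    required_kwh : energy the EV must receive before departure
    charger_kw   : maximum charging power of the connection point
    period_hours : duration of each scheduling period (e.g. 1.0 for hourly)
    prices       : forecast energy price for each period
    """
    plan = [0.0] * len(prices)
    # Visit periods from cheapest to most expensive.
    for t in sorted(range(len(prices)), key=lambda i: prices[i]):
        if required_kwh <= 0:
            break
        energy = min(charger_kw * period_hours, required_kwh)
        plan[t] = energy
        required_kwh -= energy
    return plan

# Example: 20 kWh needed, 3.7 kW charger, 8 hourly periods.
print(naive_ev_charge_allocation(20, 3.7, 1.0, [60, 45, 30, 25, 35, 55, 70, 80]))
```

A plan built this way would then serve as the initial solution that the simulated annealing search refines.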

Relevance:

60.00%

Publisher:

Abstract:

Due to the growing complexity and adaptability requirements of real-time systems, which often exhibit unrestricted Quality of Service (QoS) inter-dependencies among supported services and user-imposed quality constraints, it is increasingly difficult to optimise the level of service of a dynamic task set within a useful and bounded time. This is even more difficult when intending to benefit from the full potential of an open distributed cooperating environment, where service characteristics are not known beforehand and tasks may be inter-dependent. This paper focuses on optimising a dynamic local set of inter-dependent tasks that can be executed at varying levels of QoS, in order to achieve efficient resource usage that is constantly adapted to the specific constraints of devices and users, the nature of the executing tasks, and dynamically changing system conditions. Extensive simulations demonstrate that the proposed anytime algorithms are able to quickly find a good initial solution and effectively optimise the rate at which the quality of the current solution improves as the algorithms are given more time to run, with minimal overhead compared with their traditional versions.
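
The general anytime pattern described here (build a feasible configuration quickly, then keep improving it for as long as time allows) can be sketched as follows; the data structures, the greedy upgrade rule and the utility function are assumptions for illustration, not the paper's algorithm:

```python
import random
import time

def anytime_qos_optimise(tasks, levels, utility, budget_s):
    """Anytime sketch: start from the lowest QoS level for every task
    (assumed feasible), then greedily try upgrades while time remains.

    tasks    : list of task identifiers
    levels   : dict task -> list of admissible QoS levels, ordered low to high
    utility  : callable(config dict) -> float, higher is better
    budget_s : wall-clock time budget in seconds
    """
    config = {t: levels[t][0] for t in tasks}          # fast initial solution
    best_u = utility(config)
    deadline = time.monotonic() + budget_s
    while time.monotonic() < deadline:
        t = random.choice(tasks)
        i = levels[t].index(config[t])
        if i + 1 == len(levels[t]):
            continue                                   # already at the top level
        trial = dict(config)
        trial[t] = levels[t][i + 1]
        u = utility(trial)
        if u > best_u:                                 # keep improving upgrades only
            config, best_u = trial, u
    return config, best_u                              # usable whenever interrupted

# Example: two tasks, three QoS levels each, utility = sum of levels capped at 5.
levels = {'a': [1, 2, 3], 'b': [1, 2, 3]}
util = lambda c: sum(c.values()) if sum(c.values()) <= 5 else -1
print(anytime_qos_optimise(['a', 'b'], levels, util, 0.05))
```

Because a valid configuration exists from the first instant, the loop can be interrupted at any point and still return a usable answer, which is the defining property of an anytime algorithm.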

Relevance:

60.00%

Publisher:

Abstract:

Due to the growing complexity and dynamism of many embedded application domains (including consumer electronics, robotics, automotive and telecommunications), it is increasingly difficult to react to load variations and adapt the system's performance in a controlled fashion within a useful and bounded time. This is particularly noticeable when intending to benefit from the full potential of an open distributed cooperating environment, where service characteristics are not known beforehand and tasks may exhibit unrestricted QoS inter-dependencies. This paper proposes a novel anytime adaptive QoS control policy in which the online search for the best set of QoS levels is combined with each user's personal preferences on their services' adaptation behaviour. Extensive simulations demonstrate that the proposed anytime algorithms are able to quickly find a good initial solution and effectively optimise the rate at which the quality of the current solution improves as the algorithms are given more time to run, with minimal overhead compared with their traditional versions.

Relevance:

60.00%

Publisher:

Abstract:

Due to the growing complexity and adaptability requirements of real-time embedded systems, which often exhibit unrestricted inter-dependencies among supported services and user-imposed quality constraints, it is increasingly difficult to optimise the level of service of a dynamic task set within a useful and bounded time. This is even more difficult when intending to benefit from the full potential of an open distributed cooperating environment, where service characteristics are not known beforehand. This paper proposes an iterative refinement approach for a service's QoS configuration that takes into account services' inter-dependencies and quality constraints, trading off the achieved solution's quality against the cost of computation. Extensive simulations demonstrate that the proposed anytime algorithm is able to quickly find a good initial solution and effectively optimises the rate at which the quality of the current solution improves as the algorithm is given more time to run. The added benefits of the proposed approach clearly surpass its reduced overhead.
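
One simple way to realise the quality-versus-computation trade-off mentioned above is to stop refining once the marginal gain of an extra iteration drops below a threshold; the sketch below is an illustrative assumption, not the paper's refinement procedure:

```python
def refine_until_diminishing(initial, improve, min_gain=0.01, max_iters=1000):
    """Iteratively refine a QoS configuration, stopping when one refinement
    step no longer improves the solution quality by at least `min_gain`.

    initial : (config, quality) pair produced by a fast constructive heuristic
    improve : callable(config) -> (config, quality), one refinement step
    """
    config, quality = initial
    for _ in range(max_iters):
        new_config, new_quality = improve(config)
        if new_quality - quality < min_gain:
            break                      # further computation no longer pays off
        config, quality = new_config, new_quality
    return config, quality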

Relevance:

60.00%

Publisher:

Abstract:

In heterogeneous environments, the diversity of resources among the devices may affect their ability to perform services with specific QoS constraints, and drive peers to group themselves into a coalition for cooperative service execution. The dynamic selection of peers should be influenced by the users' QoS requirements as well as by local computation availability, tailoring the provided service to the users' specific needs. However, complex dynamic real-time scenarios may prevent the possibility of computing optimal service configurations before execution. An iterative refinement approach with the ability to trade off deliberation time for the quality of the solution is proposed. We state the importance of quickly finding a good initial solution and propose heuristic evaluation functions that optimise the rate at which the quality of the current solution improves as the algorithms are given more time to run.
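
As a sketch of how a heuristic evaluation function could rank candidate peers so that the first feasible assignment already gives a reasonable initial solution (the resource attributes and the scoring formula are illustrative assumptions, not the paper's functions):

```python
def rank_candidate_peers(peers, demand):
    """Order candidate peers by a heuristic score: prefer peers with spare
    capacity relative to the demand and with low communication latency.

    peers  : list of dicts with assumed keys 'cpu_free', 'bandwidth', 'latency_ms'
    demand : dict with assumed keys 'cpu_free' and 'bandwidth'
    """
    def score(p):
        slack = min(p['cpu_free'] / demand['cpu_free'],
                    p['bandwidth'] / demand['bandwidth'])
        return slack / (1.0 + p['latency_ms'] / 100.0)

    return sorted(peers, key=score, reverse=True)

peers = [{'cpu_free': 0.6, 'bandwidth': 20, 'latency_ms': 15},
         {'cpu_free': 0.9, 'bandwidth': 50, 'latency_ms': 40}]
print(rank_candidate_peers(peers, {'cpu_free': 0.3, 'bandwidth': 10}))
```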

Relevance:

60.00%

Publisher:

Abstract:

The scarcity and diversity of resources among the devices of heterogeneous computing environments may affect their ability to perform services with specific Quality of Service constraints, particularly in dynamic distributed environments where the characteristics of the computational load cannot always be predicted in advance. Our work addresses this problem by allowing resource-constrained devices to cooperate with more powerful neighbour nodes, opportunistically taking advantage of global distributed resources and processing power. Rather than assuming that the dynamic configuration of this cooperative service executes until it computes its optimal output, the paper proposes an anytime approach that has the ability to trade off deliberation time for the quality of the solution. Extensive simulations demonstrate that the proposed anytime algorithms are able to quickly find a good initial solution and effectively optimise the rate at which the quality of the current solution improves at each iteration, with an overhead that can be considered negligible.

Relevance:

60.00%

Publisher:

Abstract:

This paper presents a new approach (MM-GAV-FBI) to the resource-constrained project scheduling problem with multiple execution modes per activity, known in the literature as the MRCPSP. Each project has a set of activities with defined technological precedences and a set of limited resources, and each activity may have more than one execution mode. Project scheduling is performed using a Schedule Generation Scheme (SGS) integrated with a metaheuristic. The metaheuristic is based on the genetic algorithm paradigm: activity priorities are obtained from a genetic algorithm, and the chromosome representation is based on random keys. The SGS generates non-delay schedules. After a solution is obtained, a local improvement step is applied. The goal of the approach is to find the best schedule, that is, the one with the shortest possible duration, while satisfying the activity precedences and the resource constraints. The proposed approach is tested on a set of benchmark problems taken from the literature, and the computational results are compared with other approaches. The results confirm the good performance of the approach, not only in terms of solution quality but also in terms of computation time.
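
The random-key representation mentioned above is a standard device: each activity receives a key in [0, 1), and decoding picks, among the precedence-eligible activities, the one with the highest key. A minimal decoding sketch is shown below (illustrative only; the paper couples this with a serial SGS and resource-feasibility checks, which are omitted here):

```python
import random

def decode_random_keys(keys, predecessors):
    """Turn one chromosome of random keys into an activity sequence that
    respects technological precedences, scheduling the highest key first.

    keys         : dict activity -> random key in [0, 1)
    predecessors : dict activity -> set of activities that must come first
    """
    sequence, done = [], set()
    while len(done) < len(keys):
        eligible = [a for a in keys if a not in done and predecessors[a] <= done]
        nxt = max(eligible, key=lambda a: keys[a])   # priority = random key
        sequence.append(nxt)
        done.add(nxt)
    return sequence

acts = ['A', 'B', 'C', 'D']
pred = {'A': set(), 'B': {'A'}, 'C': {'A'}, 'D': {'B', 'C'}}
keys = {a: random.random() for a in acts}
print(decode_random_keys(keys, pred))
```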

Relevance:

20.00%

Publisher:

Abstract:

In a liberalized electricity market, the Transmission System Operator (TSO) plays a crucial role in power system operation. Among many other tasks, the TSO detects congestion situations and allocates the payments for electricity transmission. This paper presents a software tool for congestion management and transmission price determination in electricity markets. The congestion management is based on a reformulated Optimal Power Flow (OPF), whose main goal is to obtain a feasible re-dispatch solution that minimizes the changes to the dispatch proposed by the market operator. The transmission price computation considers the physical impact caused by the market agents on the transmission network. The final tariff includes the existing system costs as well as the costs due to the initial congestion situation and to losses. The paper includes a case study for the IEEE 30-bus power system.
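
To make the re-dispatch idea concrete, the toy sketch below minimizes the total deviation from a market dispatch while keeping the flow on one congested line within its limit, using a DC/PTDF approximation; all data and the linear-programming formulation are assumptions for illustration, not the paper's reformulated OPF:

```python
# Toy re-dispatch sketch with assumed data: minimise the total adjustment to
# the market dispatch so that the flow on one congested line stays within limit.
import numpy as np
from scipy.optimize import linprog

p_market = np.array([120.0, 80.0])     # MW cleared by the market operator
ptdf = np.array([0.8, 0.3])            # sensitivity of the congested line to each unit
line_limit = 110.0                     # MW thermal limit of the line
p_max = np.array([200.0, 200.0])       # unit capacity limits

# Decision variables: [up1, up2, down1, down2] (MW adjustments, all >= 0).
c = np.ones(4)                                         # minimise total adjustment
A_eq = np.array([[1, 1, -1, -1]])                      # keep total generation balanced
b_eq = np.array([0.0])
A_ub = np.vstack([
    np.concatenate([ptdf, -ptdf]),                     # line flow <= limit
    np.hstack([np.eye(2), np.zeros((2, 2))]),          # p_market + up <= p_max
])
b_ub = np.concatenate([[line_limit - ptdf @ p_market], p_max - p_market])

res = linprog(c, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq)
up, down = res.x[:2], res.x[2:]
print("re-dispatch:", p_market + up - down)
```

In this toy case the cheapest feasible correction shifts 20 MW from the unit with the larger line sensitivity to the other one.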

Relevance:

20.00%

Publisher:

Abstract:

Congestion management in transmission power systems has achieved high relevance in competitive environments, which require an adequate approach in both technical and economic terms. This paper proposes a new methodology for congestion management and transmission tariff determination in deregulated electricity markets. The congestion management methodology is based on a reformulated optimal power flow, whose main goal is to obtain a feasible re-dispatch solution that minimizes the changes to the transactions resulting from market operation. The proposed transmission tariffs consider the physical impact caused by each market agent on the transmission network. The final tariff considers the existing system costs as well as the costs due to the initial congestion situation and to losses. This paper includes a case study for the IEEE 118-bus test case.

Relevance:

20.00%

Publisher:

Abstract:

Introduction/Aims: The purpose of the study is to evaluate the students' perception of the organization, development and evaluation of the initial stage of the internship, in order to improve these activities and to establish adequate objectives in accordance with the changes in the concept of modern pharmacy. Materials and methods: An online survey was created using the Google Docs® Create Form extension. All results were collected and processed using Microsoft Excel®. The questionnaire consisted of 11 questions, structured on several levels: the objectives and how they can be achieved, internship organization, the internship training (effective participation in specific activities and integration in the pharmaceutical activity), the assessment, and the profile of the tutor/pharmacy. The questionnaire was completed by students from the Faculty of Pharmacy, University of Medicine and Pharmacy "Iuliu Haţieganu" Cluj Napoca, Romania. Results and discussion: The study was conducted on 308 students (60% of all students from study years II-IV). 90% of the respondents had actually participated in the internship, whilst 10% only formally participated in this activity. The main responsibilities of the students were: storage and reception of pharmaceutical products (94% and 79%, respectively) and working with receipts (57%). Most of the students considered that they were integrated into the work of the pharmacy, largely thanks to the pharmacist tutor, who showed interest and ability in mentoring activities. They considered that the role of tutor requires 3-5 years of professional experience. In terms of the internship objectives, these should aim at applying the knowledge gained up to the graduation year, but also at familiarization with activities that might turn into applications in the coming years. 43% of students believe that only 25% of the theoretical knowledge was useful during the internship. 90% of those questioned considered it useful to develop a practice guideline adapted to the year of study. Conclusions: The professional training of future pharmacists depends largely on the experience gained by students during the internship. Feedback from the students shows that they are aware of the usefulness of the internship, but believe the objectives must be updated and a better correlation between the work in the pharmacy and the theoretical knowledge has to be established. A first step is to develop a practical guide adapted to each year of study. The involvement of the tutor pharmacist is also essential to the success of this activity.

Relevance:

20.00%

Publisher:

Abstract:

In real optimization problems, the analytical expression of the objective function is usually not known, nor are its derivatives, or they are too complex to use. In these cases it becomes essential to use optimization methods where the calculation of derivatives, or the verification of their existence, is not necessary: direct search methods, also called derivative-free methods, are one solution. When the problem has constraints, penalty functions are often used. Unfortunately, the choice of the penalty parameters is frequently very difficult, because most strategies for choosing them are heuristic. Filter methods appeared as an alternative to penalty functions. A filter algorithm introduces a function that aggregates the constraint violations and constructs a bi-objective problem; in this problem a step is accepted if it reduces either the objective function or the constraint violation. This implies that filter methods are less parameter-dependent than penalty functions. In this work, we present a new direct search method for general constrained optimization, based on simplex methods, that combines the features of the simplex method and of filter methods. This method does not compute or approximate any derivatives, penalty constants or Lagrange multipliers. The basic idea of the simplex filter algorithm is to construct an initial simplex and use it to drive the search. We illustrate the behavior of our algorithm through some examples. The proposed methods were implemented in Java.
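
A compact sketch of the filter bookkeeping that the abstract describes: a trial point is accepted unless some stored (objective, violation) pair is at least as good in both components. The aggregation and acceptance rule below are a simplified illustration, not the paper's exact algorithm:

```python
def constraint_violation(x, constraints):
    """Aggregate violation h(x): sum of the positive parts of g_i(x) <= 0 constraints."""
    return sum(max(0.0, g(x)) for g in constraints)

def filter_accepts(filter_pairs, f_new, h_new):
    """Accept a trial point unless some stored pair is at least as good in both
    the objective value f and the aggregated violation h."""
    return not any(f <= f_new and h <= h_new for f, h in filter_pairs)

def filter_add(filter_pairs, f_new, h_new):
    """Insert an accepted pair and drop every entry it now dominates."""
    kept = [(f, h) for f, h in filter_pairs if not (f_new <= f and h_new <= h)]
    kept.append((f_new, h_new))
    return kept

# Tiny usage example with one constraint g(x) = x[0] + x[1] - 1 <= 0.
pairs = [(5.0, 0.0), (3.0, 0.4)]
x_trial = (0.2, 0.3)
f_trial = 4.0
h_trial = constraint_violation(x_trial, [lambda x: x[0] + x[1] - 1])
if filter_accepts(pairs, f_trial, h_trial):
    pairs = filter_add(pairs, f_trial, h_trial)
print(pairs)
```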

Relevance:

20.00%

Publisher:

Abstract:

Master's degree in Electrical and Computer Engineering

Relevance:

20.00%

Publisher:

Abstract:

Master's degree in Electrical and Computer Engineering

Relevance:

20.00%

Publisher:

Abstract:

Purpose: The sorption of sulfamethoxazole, a pharmaceutical compound frequently detected in the environment, onto walnut shells was evaluated. Methods: The sorption properties of the raw sorbent were chemically modified, yielding two additional samples treated with HCl and NaOH, respectively. Scanning electron microscopy, Fourier transform infrared spectroscopy, X-ray photoelectron spectroscopy, and thermogravimetric (TG/DTG) techniques were applied to investigate the effect of the chemical treatments on the shell surface morphology and chemistry. Sorption experiments to investigate the effect of pH on the process were carried out between pH 2 and 8. Results: The chemical treatment did not substantially alter the structure of the sorbent (physical and textural characteristics) but modified its surface chemistry (acid-base properties, point of zero charge pHpzc). The solution pH influences both the sorbent's surface charge and sulfamethoxazole speciation. The best removal efficiencies were obtained at lower pH values, where the neutral and cationic sulfamethoxazole forms are present in solution. Langmuir and Freundlich isotherms were applied to the experimental adsorption data for sulfamethoxazole sorption at pH 2, 4, and 7 onto the raw walnut shell. No statistical difference was found between the two models, except for the pH 2 experimental data, to which the Freundlich model provided a better fit. Conclusion: Sorption of sulfamethoxazole was found to be highly pH-dependent over the entire pH range studied, for both the raw and the treated sorbents.
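
For reference, the two isotherm models named above are q_e = q_max*K_L*C_e/(1 + K_L*C_e) (Langmuir) and q_e = K_F*C_e^(1/n) (Freundlich); they can be fitted to equilibrium data with an ordinary nonlinear least-squares routine, as in the sketch below, which uses invented example data rather than the study's measurements:

```python
import numpy as np
from scipy.optimize import curve_fit

def langmuir(ce, q_max, k_l):
    """Langmuir isotherm: q_e = q_max * K_L * C_e / (1 + K_L * C_e)."""
    return q_max * k_l * ce / (1.0 + k_l * ce)

def freundlich(ce, k_f, n):
    """Freundlich isotherm: q_e = K_F * C_e**(1/n)."""
    return k_f * ce ** (1.0 / n)

# Hypothetical equilibrium data (C_e in mg/L, q_e in mg/g), for illustration only.
ce = np.array([1.0, 2.5, 5.0, 10.0, 20.0])
qe = np.array([3.1, 6.0, 9.2, 12.5, 15.0])

lang_params, _ = curve_fit(langmuir, ce, qe, p0=[20.0, 0.1])
freu_params, _ = curve_fit(freundlich, ce, qe, p0=[3.0, 2.0])
print("Langmuir q_max, K_L:", lang_params)
print("Freundlich K_F, n:", freu_params)
```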

Relevance:

20.00%

Publisher:

Abstract:

The main objectives of this work were to study and optimize the treatment process for the effluent from the machines of the Cold-press unit of the Swedwood production line, to characterize the clarified solution obtained in the treatment and study its reintegration into the process, and finally to characterize the glue-paste residue obtained in the treatment and study its possible energy recovery. After an initial characterization of the effluent, and in accordance with the results of a previous study commissioned by Swedwood from an external company, it was decided to begin the treatability study with the physico-chemical process of coagulation/flocculation. In the coagulation/flocculation process, the applicability of different coagulation/flocculation agents (caustic soda, lime, ferric chloride and aluminium sulfate) was studied through Jar-test assays. The best results in this process were obtained by adding a lime dose of 500 mg/L of effluent, followed by the addition of 400 mg/L of effluent of aluminium sulfate. However, after this treatment the clarified liquid did not have the characteristics required for its reintroduction into the manufacturing process or for its discharge into a water body, so complementary treatments were studied. In this second phase, the following treatments were tested: chemical oxidation with Fenton's reagent, biological treatment in an SBR (sequencing batch reactor), and a trickling filter. The analysis of the results obtained with the different treatments showed that the most effective treatment was the biological treatment in an SBR with the addition of activated carbon. At the end of the treatment process, the clarified liquid is expected to be suitable for discharge into a water body or for reintroduction into the process. Since the study was only carried out at laboratory scale, it would be useful to validate the results at pilot scale before industrial implementation. Based on the results of the experimental study, a physico-chemical and biological treatment unit was designed at industrial scale for the treatment of the 20 m3 of effluent produced at the factory per week, together with a sludge drying bed for the treatment of the sludge produced. In the physico-chemical (coagulation/flocculation) treatment unit, the static decanters must have a working volume of 4.8 m3. Weekly, 36 L of the lime suspension (Neutrolac 300) and 12.3 L of the 8.3% aluminium sulfate solution are required, and the storage tanks for these compounds must hold 43.2 litres and 96 litres, respectively. In this unit it was estimated that 1.4 m3 of sludge are produced daily. In the biological treatment unit, the biological reactor must have a working volume of 6 m3. For this process to be effective, 2.1 kg of oxygen must be supplied daily. It is estimated that 325 litres of sludge will have to be purged weekly; at the end of the purge, the activated carbon that may be carried along with the sludge is replenished by adding 100 mg of carbon per litre of mixed liquor. According to the volume of sludge produced in both treatments, the minimum area required for the drying bed is about 27 m2. The economic analysis shows that the acquisition of the equipment costs 22,079.50 euros, the reagents required for one year of operation cost 508.50 euros in total, and the energy requirements cost 2,008.45 euros.
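
As a hedged back-of-the-envelope check of the weekly reagent consumption implied by the selected doses (dry mass only; converting these masses into volumes of the commercial lime suspension and of the 8.3% aluminium sulfate solution depends on product concentrations and densities not restated in the abstract):

```python
# Weekly dry reagent mass from the Jar-test doses (500 mg/L lime, then
# 400 mg/L aluminium sulfate) applied to the 20 m3 of effluent produced per week.
effluent_m3_per_week = 20.0
lime_dose_mg_per_l = 500.0
alum_dose_mg_per_l = 400.0

litres = effluent_m3_per_week * 1000.0
lime_kg = lime_dose_mg_per_l * litres / 1e6      # 10 kg of lime per week
alum_kg = alum_dose_mg_per_l * litres / 1e6      # 8 kg of aluminium sulfate per week
print(f"lime: {lime_kg:.1f} kg/week, aluminium sulfate: {alum_kg:.1f} kg/week")
```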