968 results for Efficient edge dominating set


Relevance:

20.00%

Publisher:

Abstract:

Competition among companies and the search for ever more efficient management and organizational models have dominated recent times. The Lean management philosophy responds to this need for higher levels of competitiveness and efficiency through a change in organizational culture, based on reducing or eliminating waste and on the continuous improvement of the processes used to manufacture goods or provide services. Lean management is supported and implemented through a set of tools, properly selected and adapted to the organizational context of the company or organization. This dissertation characterizes the most common tools of the Lean philosophy, considering their applicability in industry and in the services sector. It also addresses how Lean tools should be applied so that they do not become an isolated act that would surely lead the organization's Lean implementation to failure. For this reason, a number of rules and criteria are discussed, based on a proposed method for applying Lean tools that avoids mistakes made in the past, which caused Lean implementations to fail in some organizations. A case study from the services sector was carried out, and its results confirmed the applicability of the proposed method for applying Lean tools to services. The case study revealed a high percentage of waste in the process under analysis and made it possible to improve the operation of those processes. The improvements achieved were based on eliminating waste, solving problems and consequently standardizing processes, which enhanced the quality and efficiency of the service provided, showing that the organization under study is on the right track to successfully change its organizational culture towards the Lean philosophy.

Relevance:

20.00%

Publisher:

Abstract:

We have studied, in particular under normality of the random variables involved, the connections between different measures of risk such as the standard deviation, the W-ruin probability and the p-V@R. We discuss conditions ensuring the equivalence of these measures with respect to risk preference relations, and the equivalence of dominance and efficiency of risk-reward criteria involving these measures. More specifically, we then apply these concepts to rigorously address the problem of finding the efficient set of de Finetti's variable quota share proportional reinsurance.
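
As a pointer to why such equivalences can hold: under normality, the p-V@R reduces to an affine function of the mean and the standard deviation, so for a fixed p both measures rank risks consistently. A minimal sketch of this standard fact (the notation below is illustrative and not necessarily the authors'):

\[
  \mathrm{V@R}_p(X) = \inf\{x : \Pr(X \le x) \ge p\} = \mu + z_p\,\sigma,
  \qquad z_p = \Phi^{-1}(p), \quad X \sim N(\mu, \sigma^2),
\]
\[
  \text{so that, for fixed } p:\quad
  \mathrm{V@R}_p(X_1) \le \mathrm{V@R}_p(X_2)
  \iff \mu_1 + z_p\,\sigma_1 \le \mu_2 + z_p\,\sigma_2.
\]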

Relevance:

20.00%

Publisher:

Abstract:

Object-oriented programming languages are presently the dominant paradigm of application development (e.g., Java, .NET). Lately, increasingly more Java applications have long (or very long) execution times and manipulate large amounts of data/information, gaining relevance in fields related to e-Science (with Grid and Cloud computing). Significant examples include Chemistry, Computational Biology and Bioinformatics, with many available Java-based APIs (e.g., Neobio). Often, when the execution of such an application is terminated abruptly because of a failure (regardless of the cause being a hardware or software fault, lack of available resources, etc.), all of the work it has already performed is simply lost; when the application is later re-initiated, it has to restart all of its work from scratch, wasting resources and time, while also being prone to another failure, which may delay its completion with no deadline guarantees. Our proposed solution to these issues is to incorporate checkpointing and migration mechanisms in a JVM. These make applications more robust and flexible, as they become able to move to other nodes without any intervention from the programmer. This article provides a solution for Java applications with long execution times by extending a JVM (the Jikes research virtual machine) with such mechanisms. Copyright (C) 2011 John Wiley & Sons, Ltd.
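
To make the checkpoint/restore idea concrete, here is a minimal application-level sketch in Python (the file name and checkpoint granularity are hypothetical; the paper's contribution is doing this transparently inside the JVM rather than in user code):

import os
import pickle

CHECKPOINT = "state.pkl"  # hypothetical checkpoint file name

def run(total_steps):
    # Resume from the last checkpoint if one exists, otherwise start fresh.
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT, "rb") as f:
            step, partial = pickle.load(f)
    else:
        step, partial = 0, 0
    while step < total_steps:
        partial += step          # stand-in for a long-running computation
        step += 1
        if step % 1000 == 0:     # periodically persist the full state
            with open(CHECKPOINT, "wb") as f:
                pickle.dump((step, partial), f)
    return partial

If the process is killed and restarted, the loop continues from the last persisted step instead of from scratch, which is the behaviour the JVM-level mechanisms provide for unmodified applications.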

Relevance:

20.00%

Publisher:

Abstract:

Supervised by: Prof. Doutora Cláudia Lopes

Relevance:

20.00%

Publisher:

Abstract:

Power systems have been through deep changes in recent years, namely with the operation of competitive electricity markets and the increasingly intensive use of renewable energy sources and distributed generation. This requires new business models able to cope with the new opportunities that have emerged. Virtual Power Players (VPPs) are a new type of player that aggregates a diversity of players (Distributed Generation (DG), Storage Agents (SA), Electric Vehicles (V2G) and consumers) to facilitate their participation in electricity markets and to provide a set of new services promoting generation and consumption efficiency, while improving the players' benefits. A major task of VPPs is the remuneration of generation and services (maintenance, market operation costs and energy reserves), as well as charging for energy consumption. This paper proposes a model to implement fair and strategic remuneration and tariff methodologies, able to allow efficient VPP operation and the accomplishment of VPP goals in the scope of electricity markets.
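
As a minimal sketch of the settlement bookkeeping such a model implies (flat tariffs and invented player data; the paper's remuneration and tariff methodologies are considerably richer):

def settle(players, energy_tariff, service_fee):
    # Pay producers for delivered energy plus reserve services;
    # charge consumers for the energy they used.
    balance = {}
    for name, p in players.items():
        paid = (p.get("generated_kwh", 0.0) * energy_tariff
                + p.get("reserve_kwh", 0.0) * service_fee)
        charged = p.get("consumed_kwh", 0.0) * energy_tariff
        balance[name] = paid - charged
    return balance

players = {
    "dg_unit":  {"generated_kwh": 120.0, "reserve_kwh": 10.0},
    "consumer": {"consumed_kwh": 80.0},
}
print(settle(players, energy_tariff=0.10, service_fee=0.05))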

Relevance:

20.00%

Publisher:

Abstract:

This paper proposes a computationally efficient methodology for the optimal location and sizing of static and switched shunt capacitors in large distribution systems. The problem is formulated as the maximization of the savings produced by the reduction in energy losses and by the avoided costs of investment deferral in the expansion of the network. The proposed method selects the nodes to be compensated, as well as the optimal capacitor ratings and their operational characteristics, i.e. fixed or switched. After an appropriate linearization, the optimization problem is formulated as a large-scale mixed-integer linear problem, suitable for being solved by a widespread commercial package. Results of the proposed optimization method are compared with another recent methodology reported in the literature using two test cases: a 15-bus and a 33-bus distribution network. For both test cases, the proposed methodology delivers better solutions, indicated by higher loss savings achieved with lower amounts of capacitive compensation. The proposed method has also been applied to the compensation of an actual large distribution network served by AES-Venezuela in the metropolitan area of Caracas. A convergence time of about 4 seconds after 22298 iterations demonstrates the ability of the proposed methodology to efficiently handle large-scale compensation problems.
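
As an illustration of the kind of formulation involved, a toy mixed-integer linear model in Python with PuLP follows. The candidate nodes, savings coefficients and costs are invented, and the paper's linearized model additionally distinguishes fixed from switched banks and embeds network constraints:

from pulp import LpProblem, LpMaximize, LpVariable, lpSum, LpBinary

nodes = ["n1", "n2", "n3"]
sizes_kvar = [150, 300, 450]
# Hypothetical annual loss savings per (node, rating) combination.
saving = {("n1", 150): 40, ("n1", 300): 70, ("n1", 450): 85,
          ("n2", 150): 55, ("n2", 300): 90, ("n2", 450): 100,
          ("n3", 150): 30, ("n3", 300): 50, ("n3", 450): 60}
cost_per_kvar = 0.12

prob = LpProblem("capacitor_placement", LpMaximize)
x = {(n, s): LpVariable(f"x_{n}_{s}", cat=LpBinary)
     for n in nodes for s in sizes_kvar}

# Objective: energy-loss savings minus investment cost.
prob += lpSum(x[n, s] * (saving[n, s] - cost_per_kvar * s)
              for n in nodes for s in sizes_kvar)

# At most one capacitor rating per candidate node.
for n in nodes:
    prob += lpSum(x[n, s] for s in sizes_kvar) <= 1

prob.solve()
chosen = [(n, s) for (n, s) in x if x[n, s].value() == 1]
print(chosen)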

Relevance:

20.00%

Publisher:

Abstract:

In recent years, Power Systems (PS) have experienced many changes in their operation. The introduction of new players managing Distributed Generation (DG) units and the existence of new Demand Response (DR) programs make the control of the system a more complex problem, while allowing a more flexible management. Intelligent resource management in the context of smart grids is of huge importance so that smart grid functions are assured. This paper proposes a new methodology to support system operators and/or Virtual Power Players (VPPs) in determining effective and efficient DR programs that can be put into practice. The method is based on data mining techniques applied to a database obtained from a large set of operation scenarios. The paper includes a case study based on 27,000 scenarios considering a diversity of distributed resources in a 32-bus distribution network.
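
A minimal sketch of the data-mining step in Python with scikit-learn: learn which DR program suits each operation scenario from a pre-computed scenario database. All features, labels and the toy rule below are invented stand-ins; the paper derives its database from 27,000 simulated scenarios of a 32-bus network:

import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(0)
# Hypothetical features per scenario: [total load MW, wind MW, price EUR/MWh]
X = rng.uniform([10, 0, 20], [60, 30, 120], size=(1000, 3))
# Hypothetical label: index of the most effective DR program per scenario,
# here generated by a toy rule standing in for the simulation results.
y = (X[:, 0] - X[:, 1] > 35).astype(int)

clf = DecisionTreeClassifier(max_depth=4).fit(X, y)
new_scenario = [[45.0, 5.0, 90.0]]
print("suggested DR program:", clf.predict(new_scenario)[0])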

Relevance:

20.00%

Publisher:

Abstract:

Intensive use of Distributed Generation (DG) represents a change in the paradigm of power systems operation, making small-scale energy generation and storage decision making relevant for the whole system. This paradigm led to the concept of the smart grid, for which efficient management, both in technical and economic terms, should be assured. This paper presents a new approach to solve the economic dispatch problem in smart grids. The proposed resource management methodology involves two stages. The first one uses fuzzy set theory to define the range forecast of the natural resources as well as the load forecast. The second stage uses heuristic optimization to determine the economic dispatch, considering the generation forecast, storage management and demand response.
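
A two-stage sketch in Python under stated assumptions: (1) triangular fuzzy forecasts for renewables and load, (2) a simple cheapest-first dispatch over the defuzzified values. The numbers, the centroid defuzzification and the greedy heuristic are illustrative choices, not the paper's method:

def defuzzify(tri):
    # Centroid of a triangular fuzzy number (low, mode, high).
    low, mode, high = tri
    return (low + mode + high) / 3.0

wind_fc = (8.0, 12.0, 15.0)    # MW, fuzzy range forecast
load_fc = (40.0, 45.0, 52.0)   # MW

residual = defuzzify(load_fc) - defuzzify(wind_fc)

# Dispatch dispatchable units cheapest-first to cover the residual demand.
units = [("dg1", 20.0, 30.0), ("dg2", 15.0, 45.0), ("dg3", 25.0, 60.0)]
units.sort(key=lambda u: u[2])  # sort by cost in EUR/MWh

schedule, remaining = {}, residual
for name, cap, _cost in units:
    take = min(cap, max(remaining, 0.0))
    schedule[name] = take
    remaining -= take
print(schedule)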

Relevance:

20.00%

Publisher:

Abstract:

A novel agent-based approach to meta-heuristics self-configuration is proposed in this work. Meta-heuristics are examples of algorithms whose parameters need to be set up as efficiently as possible in order to ensure their performance. This paper presents a learning module for the self-parameterization of Meta-heuristics (MHs) in a Multi-Agent System (MAS) for the resolution of scheduling problems. The learning is based on Case-based Reasoning (CBR), and two different integration approaches are proposed. A computational study is made to compare the two CBR integration perspectives. In the end, some conclusions are reached and future work is outlined.
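
The retrieval step of CBR can be sketched in a few lines of Python: find the most similar past scheduling instance and reuse its metaheuristic parameters. The case base, features and parameter names below are invented; the paper embeds this in a multi-agent system with two alternative integration schemes:

import math

case_base = [
    # (features: n_jobs, n_machines) -> stored MH parameters (hypothetical)
    ((20, 5),   {"tabu_tenure": 7,  "iterations": 500}),
    ((100, 10), {"tabu_tenure": 15, "iterations": 3000}),
    ((50, 8),   {"tabu_tenure": 10, "iterations": 1500}),
]

def retrieve(query):
    # Nearest neighbour over the case features (Euclidean distance).
    return min(case_base, key=lambda c: math.dist(c[0], query))[1]

print(retrieve((60, 9)))  # reuse the parameters of the closest stored case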

Relevance:

20.00%

Publisher:

Abstract:

The new hexanuclear mixed-valence vanadium complex [V3O3(OEt)(ashz)2(μ-OEt)]2 (1) with an N,O-donor ligand is reported. It acts as a highly efficient catalyst for alkane oxidation by aqueous H2O2. Remarkably high turnover numbers of up to 25000, with product yields of up to 27% (based on alkane), make it one of the most active systems for such reactions.
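
For reference, the turnover number and the yield quoted above measure different things; with the usual definitions (standard conventions, not stated in the abstract):

\[
  \mathrm{TON} = \frac{n_{\text{product}}}{n_{\text{catalyst}}},
  \qquad
  \text{yield (\%)} = \frac{n_{\text{product}}}{n_{\text{alkane}}} \times 100,
\]

so a very high TON alongside a 27% yield simply reflects a small amount of catalyst relative to substrate.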

Relevance:

20.00%

Publisher:

Abstract:

Copyright © 2013 Springer Netherlands.

Relevance:

20.00%

Publisher:

Abstract:

Master's degree in Electrical and Computer Engineering

Relevance:

20.00%

Publisher:

Abstract:

The hydrotris(pyrazol-1-yl)methane iron(II) complex [FeCl2{η3-HC(pz)3}] (Fe, pz = pyrazol-1-yl) immobilized on commercial (MOR) or desilicated (MOR-D) zeolite catalyses the oxidation of cyclohexane with hydrogen peroxide to cyclohexanol and cyclohexanone under mild conditions. MOR-D/Fe (the desilicated-zeolite-supported [FeCl2{η3-HC(pz)3}] complex) provides an outstanding catalytic activity (TON up to 2.90 × 10³) with a concomitant overall yield of 38%, and can be easily recovered and reused. The MOR- and MOR-D-supported hydrotris(pyrazol-1-yl)methane iron(II) complexes (MOR/Fe and MOR-D/Fe, respectively) were characterized by X-ray powder diffraction, ICP-AES and TEM studies, as well as by IR spectroscopy and N2 adsorption at -196 °C. The catalytic operational conditions (e.g., reaction time, type and amount of oxidant, presence of acid and type of solvent) were optimized. (C) 2013 Elsevier B.V. All rights reserved.

Relevance:

20.00%

Publisher:

Abstract:

Video coding technologies have played a major role in the explosion of large-market digital video applications and services. In this context, the very popular MPEG-x and H.26x video coding standards adopted a predictive coding paradigm, where complex encoders exploit the data redundancy and irrelevancy to 'control' much simpler decoders. This codec paradigm fits well applications and services such as digital television and video storage, where the decoder complexity is critical, but does not match well the requirements of emerging applications such as visual sensor networks, where the encoder complexity is more critical. The Slepian-Wolf and Wyner-Ziv theorems brought the possibility to develop the so-called Wyner-Ziv video codecs, following a different coding paradigm where it is the task of the decoder, and no longer of the encoder, to (fully or partly) exploit the video redundancy. Theoretically, Wyner-Ziv video coding does not incur any compression performance penalty with regard to the more traditional predictive coding paradigm (at least under certain conditions). In the context of Wyner-Ziv video codecs, the so-called side information, which is a decoder estimate of the original frame to code, plays a critical role in the overall compression performance. For this reason, much research effort has been invested in the past decade to develop increasingly more efficient side information creation methods. The main objective of this paper is to review and evaluate the available side information methods after proposing a classification taxonomy to guide this review, allowing more solid conclusions to be achieved and the next relevant research challenges to be better identified. After classifying the side information creation methods into four classes, notably guess, try, hint and learn, the review of the most important techniques in each class, and the evaluation of some of them, leads to the important conclusion that which side information creation method provides the better rate-distortion (RD) performance depends on the amount of temporal correlation in each video sequence. It also became clear that the best available Wyner-Ziv video coding solutions are almost systematically based on the learn approach. The best solutions are already able to systematically outperform H.264/AVC Intra, and also the H.264/AVC zero-motion standard solutions for specific types of content. (C) 2013 Elsevier B.V. All rights reserved.
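
A minimal Python sketch of the simplest "guess"-class side information: estimate the Wyner-Ziv frame by temporal interpolation of its two neighbouring decoded key frames. Real codecs use motion-compensated interpolation; the plain average below is only the degenerate zero-motion case, shown for illustration:

import numpy as np

def side_information(prev_key: np.ndarray, next_key: np.ndarray) -> np.ndarray:
    # Zero-motion temporal interpolation of two decoded key frames.
    return ((prev_key.astype(np.uint16) + next_key.astype(np.uint16)) // 2) \
        .astype(np.uint8)

prev_key = np.random.randint(0, 256, (4, 4), dtype=np.uint8)
next_key = np.random.randint(0, 256, (4, 4), dtype=np.uint8)
print(side_information(prev_key, next_key))

The better this decoder-side estimate matches the original frame, the fewer parity bits the encoder must send, which is why side information quality drives the RD performance discussed above.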

Relevance:

20.00%

Publisher:

Abstract:

The ICF is a classification system adopted by the WHO that serves as a universal reference for describing, assessing and measuring health and disability, both at the individual level and at the population level. However, despite the international interest generated around the ICF, it is considered a complex and extensive classification, which prompted the creation of core sets, lists of ICF items specifically selected for their relevance in describing and qualifying a given health condition, as a response to this problem. To date, core sets have been developed for several common pathologies. However, although motor control has been a widely recognized research area over the last 20 years, it does not yet have a core set of its own. Thus, the objective of this study is to contribute to the development of a core set, based on the ICF-CY, aimed at a comprehensive description of the competences of children aged 6 to 18 with motor control deficits. To that end, a literature review on the topic was carried out in order to gather information for building a core set proposal, which was then subjected to expert scrutiny through the Delphi method. After several rounds, consensus was reached on the final list of ICF codes that make up the final core set.