959 results for efficient algorithms
Abstract:
Object-oriented programming languages are presently the dominant paradigm for application development (e.g., Java, .NET). Lately, more and more Java applications have long (or very long) execution times and manipulate large amounts of data, gaining relevance in fields related to e-Science (with Grid and Cloud computing). Significant examples include Chemistry, Computational Biology, and Bioinformatics, with many available Java-based APIs (e.g., Neobio). Often, when the execution of such an application is terminated abruptly because of a failure (regardless of whether the cause is a hardware or software fault, a lack of available resources, etc.), all of the work it has already performed is simply lost; when the application is later re-initiated, it has to restart all its work from scratch, wasting resources and time, while remaining prone to another failure that may delay its completion with no deadline guarantees. Our proposed solution to these issues is to incorporate checkpointing and migration mechanisms in a JVM. These make applications more robust and flexible, as they become able to move to other nodes without any intervention from the programmer. This article provides a solution for Java applications with long execution times by extending a JVM (the Jikes Research Virtual Machine) with such mechanisms. Copyright (C) 2011 John Wiley & Sons, Ltd.
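The paper builds the mechanism into the JVM itself, transparently to the programmer; as a rough illustration of the underlying idea only, here is a minimal application-level checkpointing sketch in Python, where the file name, checkpoint interval, and the "work" loop are all hypothetical. Progress is serialized periodically so that a restarted run resumes from the last saved state instead of from scratch:

```python
import os
import pickle

CHECKPOINT = "job.ckpt"  # hypothetical checkpoint file name

def run_job(total_steps=1_000_000, every=10_000):
    # Resume from the last checkpoint if one exists, otherwise start fresh.
    if os.path.exists(CHECKPOINT):
        with open(CHECKPOINT, "rb") as f:
            state = pickle.load(f)
    else:
        state = {"step": 0, "accum": 0}

    while state["step"] < total_steps:
        state["accum"] += state["step"]      # stand-in for the real work
        state["step"] += 1
        if state["step"] % every == 0:
            # Write atomically so a crash mid-write cannot corrupt the checkpoint.
            with open(CHECKPOINT + ".tmp", "wb") as f:
                pickle.dump(state, f)
            os.replace(CHECKPOINT + ".tmp", CHECKPOINT)
    return state["accum"]

if __name__ == "__main__":
    print(run_job())
```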
Abstract:
The advances made in channel-capacity codes, such as turbo codes and low-density parity-check (LDPC) codes, have played a major role in the emerging distributed source coding paradigm. LDPC codes can be easily adapted to new source coding strategies due to their natural representation as bipartite graphs and the use of quasi-optimal decoding algorithms, such as belief propagation. This paper tackles a relevant scenario in distributed video coding: lossy source coding when multiple side information (SI) hypotheses are available at the decoder, each one correlated with the source according to a different correlation noise channel. Thus, it is proposed to exploit the multiple SI hypotheses through an efficient joint decoding technique with multiple LDPC syndrome decoders that exchange information to obtain coding efficiency improvements. At the decoder side, the multiple SI hypotheses are created with motion-compensated frame interpolation and fused together in a novel iterative LDPC-based Slepian-Wolf decoding algorithm. With the creation of multiple SI hypotheses and the proposed decoding algorithm, bitrate savings of up to 8.0% are obtained for similar decoded quality.
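The paper's joint decoding exchanges information between multiple LDPC syndrome decoders; the full scheme is beyond a short sketch, but the core intuition of combining soft evidence from several SI hypotheses can be illustrated as a weighted sum of per-bit log-likelihood ratios. This is a simplification that assumes the hypotheses act as conditionally independent observations of the same source bits, and all names and numbers below are illustrative:

```python
import numpy as np

def fuse_llrs(llr_per_hypothesis, weights=None):
    """Fuse per-bit LLRs from several side-information hypotheses.

    llr_per_hypothesis: array of shape (H, N), one LLR row per hypothesis.
    Summing (weighted) LLRs treats the hypotheses as conditionally
    independent observations of the same source bits, a simplification
    of the decoder exchange described in the paper.
    """
    llrs = np.asarray(llr_per_hypothesis, dtype=float)
    if weights is None:
        weights = np.ones(llrs.shape[0])
    return np.asarray(weights, dtype=float) @ llrs

# Example: two SI hypotheses over 4 bits, the second weighted as less reliable.
fused = fuse_llrs([[+2.0, -1.0, +0.5, -3.0],
                   [+1.5, +0.8, -0.2, -2.5]], weights=[1.0, 0.6])
print(fused)  # fused soft input that a syndrome decoder would consume
```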
Abstract:
The attached document is the post-print version (the version corrected by the editor).
Abstract:
A manufacturing system has a naturally dynamic nature, observed through several kinds of random occurrences and perturbations in working conditions and requirements over time. In this kind of environment, the ability to efficiently and effectively adapt existing schedules to such disturbances on a continuous basis, while maintaining performance levels, is important. The application of meta-heuristics and multi-agent systems to the resolution of this class of real-world scheduling problems seems very promising. This paper presents a prototype of MASDScheGATS (Multi-Agent System for Distributed Manufacturing Scheduling with Genetic Algorithms and Tabu Search).
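MASDScheGATS combines genetic algorithms and tabu search inside scheduling agents; as a flavour of the latter, here is a minimal tabu search for a single-machine total-tardiness problem. The problem data, adjacent-swap neighbourhood, and tenure value are illustrative choices, not the system's actual design:

```python
def tardiness(order, durations, due):
    # Total tardiness of a job sequence on one machine.
    t, total = 0, 0
    for j in order:
        t += durations[j]
        total += max(0, t - due[j])
    return total

def tabu_search(durations, due, iters=200, tenure=5):
    n = len(durations)
    current = list(range(n))
    best, best_cost = current[:], tardiness(current, durations, due)
    tabu = {}                                # swap position -> expiry iteration
    for it in range(iters):
        candidates = []
        for i in range(n - 1):
            if tabu.get(i, -1) >= it:        # this swap is currently tabu
                continue
            neigh = current[:]
            neigh[i], neigh[i + 1] = neigh[i + 1], neigh[i]
            candidates.append((tardiness(neigh, durations, due), i, neigh))
        if not candidates:
            break
        cost, i, neigh = min(candidates)     # best non-tabu neighbour
        current = neigh
        tabu[i] = it + tenure                # forbid reversing this swap for a while
        if cost < best_cost:
            best, best_cost = neigh[:], cost
    return best, best_cost

print(tabu_search([4, 2, 6, 3], [5, 6, 10, 7]))
```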
Abstract:
This paper proposes a computationally efficient methodology for the optimal location and sizing of static and switched shunt capacitors in large distribution systems. The problem is formulated as the maximization of the savings produced by the reduction in energy losses and the avoided costs due to investment deferral in the expansion of the network. The proposed method selects the nodes to be compensated, as well as the optimal capacitor ratings and their operational characteristics, i.e. fixed or switched. After an appropriate linearization, the optimization problem is formulated as a large-scale mixed-integer linear problem, suitable for solution by a widely used commercial package. Results of the proposed optimization method are compared with another recent methodology reported in the literature on two test cases: a 15-bus and a 33-bus distribution network. For both test cases, the proposed methodology delivers better solutions, indicated by higher loss savings achieved with lower amounts of capacitive compensation. The proposed method has also been applied to compensate an actual large distribution network served by AES-Venezuela in the metropolitan area of Caracas. A convergence time of about 4 seconds after 22,298 iterations demonstrates the ability of the proposed methodology to efficiently handle large-scale compensation problems.
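To make the mixed-integer linear formulation concrete, the sketch below states a heavily simplified version of the placement-and-sizing decision in Python with the PuLP modelling library. All coefficients, the one-bank-per-node constraint, and the kvar budget are illustrative assumptions, not the paper's actual model:

```python
import pulp

# Toy data: candidate nodes, available capacitor ratings (kvar), and an
# assumed linear loss-saving coefficient per kvar at each node.
nodes = ["n1", "n2", "n3"]
ratings = [300, 600, 900]
saving_per_kvar = {"n1": 2.1, "n2": 1.4, "n3": 1.8}   # $/kvar-year (made up)
cost_per_kvar = 0.9                                    # annualized $/kvar (made up)
budget_kvar = 1200

prob = pulp.LpProblem("capacitor_placement", pulp.LpMaximize)
# x[n, r] = 1 if a bank of rating r is installed at node n.
x = {(n, r): pulp.LpVariable(f"x_{n}_{r}", cat="Binary")
     for n in nodes for r in ratings}

# Objective: net savings = loss-reduction benefit minus capacitor cost.
prob += pulp.lpSum((saving_per_kvar[n] - cost_per_kvar) * r * x[n, r]
                   for n in nodes for r in ratings)
for n in nodes:                            # at most one bank per node
    prob += pulp.lpSum(x[n, r] for r in ratings) <= 1
prob += pulp.lpSum(r * x[n, r] for n in nodes for r in ratings) <= budget_kvar

prob.solve(pulp.PULP_CBC_CMD(msg=False))
for (n, r), var in x.items():
    if var.value() == 1:
        print(f"install {r} kvar at {n}")
```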
Abstract:
The large increase of Distributed Generation (DG) in power systems, and especially in distribution networks, makes the management of distributed generation resources an increasingly important issue. Beyond DG, other resources such as storage systems and demand response must be managed in order to obtain a more efficient and "green" operation of power systems. New players that operate these kinds of resources, such as aggregators or Virtual Power Players (VPPs), will keep appearing. This paper proposes a new methodology to solve the distribution network short-term scheduling problem in the Smart Grid context. This methodology is based on a Genetic Algorithm (GA) approach for energy resource scheduling optimization and on the PSCAD software to obtain realistic power system simulation results. The paper includes a case study with 99 distributed generators, 208 loads, and 27 storage units. The GA results for the determination of the economic dispatch, considering the generation forecast, storage management, and load curtailment in each period (one hour), are compared with those obtained with a Mixed-Integer Non-Linear Programming (MINLP) approach.
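As a rough illustration of the GA stage (not the paper's actual encoding or operators), the sketch below evolves a real-valued dispatch vector for a single period, penalizing any mismatch between total generation and demand; all costs, capacities, and GA settings are made-up toy values:

```python
import random

# Toy data: marginal cost and capacity of each distributed generator,
# and the demand for one scheduling period (one hour).
COST = [30.0, 45.0, 60.0]      # $/MWh
CAP  = [5.0, 8.0, 4.0]         # MW
DEMAND = 10.0                  # MW

def fitness(dispatch):
    cost = sum(c * p for c, p in zip(COST, dispatch))
    imbalance = abs(sum(dispatch) - DEMAND)
    return cost + 1e4 * imbalance            # penalize demand violation

def genetic_dispatch(pop_size=40, gens=200, mut=0.2):
    pop = [[random.uniform(0, c) for c in CAP] for _ in range(pop_size)]
    for _ in range(gens):
        pop.sort(key=fitness)
        survivors = pop[: pop_size // 2]      # elitist truncation selection
        children = []
        while len(children) < pop_size - len(survivors):
            a, b = random.sample(survivors, 2)
            child = [random.choice(g) for g in zip(a, b)]   # uniform crossover
            if random.random() < mut:
                i = random.randrange(len(child))
                child[i] = random.uniform(0, CAP[i])        # gene mutation
            children.append(child)
        pop = survivors + children
    best = min(pop, key=fitness)
    return best, fitness(best)

print(genetic_dispatch())
```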
Abstract:
Intensive use of Distributed Generation (DG) represents a change in the paradigm of power system operation, making small-scale energy generation and storage decision making relevant for the whole system. This paradigm led to the concept of the smart grid, for which efficient management, in both technical and economic terms, must be ensured. This paper presents a new approach to solve the economic dispatch in smart grids. The proposed methodology for resource management involves two stages. The first one uses fuzzy set theory to define the range forecast of the natural resources as well as the load forecast. The second stage uses heuristic optimization to determine the economic dispatch, considering the generation forecast, storage management, and demand response.
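A minimal sketch of the two stages might look as follows, with a triangular fuzzy number standing in for the range forecasts and a greedy merit-order rule standing in for the heuristic optimization. Both are simplifications, and the function names and numbers are illustrative:

```python
def defuzzify_triangular(low, mode, high):
    """Centroid of a triangular fuzzy number: a simple stand-in for the
    fuzzy forecast stage described in the paper."""
    return (low + mode + high) / 3.0

def merit_order_dispatch(demand, units):
    """Greedy (priority-list) dispatch: cheapest units first. An
    illustrative heuristic, not the paper's optimizer."""
    schedule = {}
    remaining = demand
    for name, cost, cap in sorted(units, key=lambda u: u[1]):
        p = min(cap, max(0.0, remaining))
        schedule[name] = p
        remaining -= p
    return schedule, remaining               # remaining > 0 means unserved load

wind = defuzzify_triangular(2.0, 5.0, 6.0)   # MW, crisp value of fuzzy wind forecast
load = defuzzify_triangular(9.0, 10.0, 12.0) # MW, crisp value of fuzzy load forecast
units = [("wind", 0.0, wind), ("chp", 40.0, 6.0), ("diesel", 90.0, 5.0)]
print(merit_order_dispatch(load, units))
```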
Abstract:
A novel agent-based approach to meta-heuristics self-configuration is proposed in this work. Meta-heuristics are examples of algorithms whose parameters need to be tuned as efficiently as possible in order to ensure their performance. This paper presents a learning module for the self-parameterization of meta-heuristics (MHs) in a Multi-Agent System (MAS) for the resolution of scheduling problems. The learning is based on Case-based Reasoning (CBR), and two different integration approaches are proposed. A computational study is carried out to compare the two CBR integration perspectives. In the end, some conclusions are reached and future work is outlined.
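The retrieve/reuse/retain cycle at the heart of CBR-based self-parameterization can be sketched very simply; the feature set, the stored cases, and the 1-nearest-neighbour retrieval below are illustrative assumptions, not the paper's actual module:

```python
import math

# Hypothetical case base: problem features (n_jobs, n_machines, due-date
# tightness) mapped to meta-heuristic parameters that worked well before.
CASES = [
    ((20, 5, 0.3),   {"pop_size": 50,  "mutation": 0.10}),
    ((100, 10, 0.8), {"pop_size": 200, "mutation": 0.05}),
    ((50, 8, 0.5),   {"pop_size": 100, "mutation": 0.08}),
]

def retrieve(features):
    """Return the parameters of the nearest stored case (1-NN retrieval),
    the simplest form of the CBR retrieve/reuse step."""
    return min(CASES, key=lambda case: math.dist(features, case[0]))[1]

def retain(features, params):
    """After a run, store the (problem, parameters) pair for future reuse."""
    CASES.append((tuple(features), dict(params)))

print(retrieve((60, 9, 0.6)))   # parameters borrowed from the closest past problem
```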
Abstract:
The new hexanuclear mixed-valence vanadium complex [V3O3(OEt)(ashz)(2)(mu-OEt)](2) (1) with an N,O-donor ligand is reported. It acts as a highly efficient catalyst for alkane oxidations by aqueous H2O2. Remarkably, high turnover numbers of up to 25000, with product yields of up to 27% (based on alkane), make this one of the most active systems for such reactions.
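For reference, the turnover number (TON) quoted here follows the standard definition, moles of product formed per mole of catalyst (a general definition, not specific to this paper):

\[ \mathrm{TON} = \frac{n_{\text{product}}}{n_{\text{catalyst}}} \]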
Abstract:
Master's in Electrical and Computer Engineering
Abstract:
The hydrotris(pyrazol-1-yl)methane iron(II) complex [FeCl2{eta(3)-HC(pz)(3)}] (Fe, pz = pyrazol-1-yl), immobilized on commercial (MOR) or desilicated (MOR-D) zeolite, catalyses the oxidation of cyclohexane with hydrogen peroxide to cyclohexanol and cyclohexanone under mild conditions. MOR-D/Fe (the desilicated-zeolite-supported [FeCl2{eta(3)-HC(pz)(3)}] complex) provides an outstanding catalytic activity (TON up to 2.90 x 10(3)) with a concomitant overall yield of 38%, and can be easily recovered and reused. The MOR- and MOR-D-supported hydrotris(pyrazol-1-yl)methane iron(II) complexes (MOR/Fe and MOR-D/Fe, respectively) were characterized by X-ray powder diffraction, ICP-AES, and TEM studies, as well as by IR spectroscopy and N-2 adsorption at -196 degrees C. The catalytic operational conditions (e.g., reaction time, type and amount of oxidant, presence of acid, and type of solvent) were optimized. (C) 2013 Elsevier B.V. All rights reserved.
Abstract:
Video coding technologies have played a major role in the explosion of large-market digital video applications and services. In this context, the very popular MPEG-x and H.26x video coding standards adopted a predictive coding paradigm, where complex encoders exploit the data redundancy and irrelevancy to 'control' much simpler decoders. This codec paradigm fits well applications and services such as digital television and video storage, where the decoder complexity is critical, but does not match the requirements of emerging applications such as visual sensor networks, where the encoder complexity is more critical. The Slepian-Wolf and Wyner-Ziv theorems brought the possibility to develop so-called Wyner-Ziv video codecs, which follow a different coding paradigm where it is the task of the decoder, and no longer of the encoder, to (fully or partly) exploit the video redundancy. Theoretically, Wyner-Ziv video coding does not incur any compression performance penalty relative to the more traditional predictive coding paradigm (at least under certain conditions). In Wyner-Ziv video codecs, the so-called side information, a decoder estimate of the original frame to code, plays a critical role in the overall compression performance. For this reason, much research effort has been invested in the past decade to develop increasingly more efficient side information creation methods. The main objective of this paper is to review and evaluate the available side information methods after proposing a classification taxonomy to guide this review, allowing more solid conclusions to be reached and the next relevant research challenges to be better identified. After classifying the side information creation methods into four classes, notably guess, try, hint, and learn, the review of the most important techniques in each class, and the evaluation of some of them, leads to the important conclusion that the rate-distortion (RD) performance delivered by the side information creation methods depends on the amount of temporal correlation in each video sequence. It also became clear that the best available Wyner-Ziv video coding solutions are almost systematically based on the learn approach. The best solutions are already able to systematically outperform H.264/AVC Intra, and also the H.264/AVC zero-motion standard solution for specific types of content. (C) 2013 Elsevier B.V. All rights reserved.
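Of the four classes, the "guess" class typically builds the side information by motion-compensated frame interpolation between the two key frames surrounding the frame to code. The sketch below is a deliberately crude version of that idea (full-search block matching, then averaging the matched blocks); the methods reviewed in the paper are considerably more sophisticated:

```python
import numpy as np

def interpolate_si(prev_frame, next_frame, block=8, search=4):
    """Toy side-information creation: for each block of the next key
    frame, find its best match in the previous key frame, then average
    the two matched blocks as a crude estimate of the intermediate frame."""
    h, w = prev_frame.shape
    si = np.zeros_like(prev_frame, dtype=float)
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            tgt = next_frame[y:y+block, x:x+block].astype(float)
            best, bv = None, (0, 0)
            for dy in range(-search, search + 1):      # full search window
                for dx in range(-search, search + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy <= h - block and 0 <= xx <= w - block:
                        ref = prev_frame[yy:yy+block, xx:xx+block].astype(float)
                        err = np.abs(ref - tgt).sum()   # SAD matching criterion
                        if best is None or err < best:
                            best, bv = err, (dy, dx)
            yy, xx = y + bv[0], x + bv[1]
            si[y:y+block, x:x+block] = 0.5 * (
                prev_frame[yy:yy+block, xx:xx+block] + tgt)
    return si

prev = np.random.randint(0, 256, (32, 32))
nxt = np.roll(prev, 2, axis=1)          # simulate simple horizontal motion
print(interpolate_si(prev, nxt).shape)  # SI frame for the Wyner-Ziv decoder
```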
Abstract:
Postgraduate programme in Computer Science - IBILCE
Abstract:
Master's in Nuclear Medicine.