917 results for Optimization algorithm
Abstract:
10th Conference on Telecommunications (Conftele 2015), Aveiro, Portugal.
Abstract:
8th International Workshop on Multiple Access Communications (MACOM2015), Helsinki, Finland.
Abstract:
Work presented within the scope of the Master's programme in Informatics Engineering, as a partial requirement for obtaining the degree of Master in Informatics Engineering.
Abstract:
In this paper we present the operational matrices of the left Caputo fractional derivative, right Caputo fractional derivative and Riemann–Liouville fractional integral for shifted Legendre polynomials. We develop an accurate numerical algorithm to solve the two-sided space–time fractional advection–dispersion equation (FADE) based on a spectral shifted Legendre tau (SLT) method in combination with the derived shifted Legendre operational matrices. The fractional derivatives are described in the Caputo sense. We propose a spectral SLT method for both the temporal and spatial discretizations of the two-sided space–time FADE. This technique reduces the two-sided space–time FADE to a system of algebraic equations, which greatly simplifies the problem. Numerical experiments were carried out to confirm the spectral accuracy and efficiency of the proposed algorithm. By selecting relatively few Legendre polynomial degrees, we are able to obtain very accurate approximations, demonstrating the utility of the new approach over other numerical methods.
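As an illustration of the basis underlying the SLT method, the shifted Legendre polynomials on [0, 1] can be evaluated with the classical three-term recurrence; this is a sketch of the basis only, not of the operational matrices derived in the paper:

```python
def shifted_legendre(k, x):
    """Evaluate the shifted Legendre polynomial P_k(2x - 1) on [0, 1]
    via the standard three-term recurrence:
    (n + 1) P_{n+1}(t) = (2n + 1) t P_n(t) - n P_{n-1}(t)."""
    t = 2.0 * x - 1.0          # map [0, 1] onto [-1, 1]
    p_prev, p = 1.0, t         # P_0 and P_1
    if k == 0:
        return p_prev
    for n in range(1, k):
        p_prev, p = p, ((2 * n + 1) * t * p - n * p_prev) / (n + 1)
    return p

print(shifted_legendre(2, 1.0))  # 1.0, since P_2(1) = 1
```

Spectral methods expand the unknown solution in such a basis, so that the fractional derivative and integral operators act as matrices on the expansion coefficients.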
Abstract:
The random amplified polymorphic DNA (RAPD) technique is a simple and reliable method to detect DNA polymorphism. Several factors can affect the amplification profiles, causing false bands and non-reproducibility of the assay. In this study, we analyzed the effect of changing the concentrations of primer, magnesium chloride, template DNA and Taq DNA polymerase, with the objective of determining their optimum concentrations for the standardization of the RAPD technique for genetic studies of Cuban Triatominae. Reproducible amplification patterns were obtained using 5 pmol of primer, 2.5 mM of MgCl2, 25 ng of template DNA and 2 U of Taq DNA polymerase in a 25 µL reaction. A panel of five random primers was used to evaluate the genetic variability of T. flavida. Three of these (OPA-1, OPA-2 and OPA-4) generated reproducible and distinguishable fingerprinting patterns of Triatominae. Numerical analysis of the 52 RAPD bands amplified by all five primers was carried out with the unweighted pair group method with arithmetic mean (UPGMA). Jaccard's similarity coefficient data were used to construct a dendrogram. Two groups could be distinguished by the RAPD data, and these groups coincided with geographic origin, i.e. the populations captured in areas east and west of Guanahacabibes, Pinar del Río. T. flavida presents low interpopulation variability, which could result in greater susceptibility to pesticides in control programs. The RAPD protocol and the selected primers are useful for molecular characterization of Cuban Triatominae.
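Jaccard's similarity coefficient on RAPD data is computed from binary band-presence profiles: shared bands divided by bands present in either individual. A minimal sketch (the band profiles below are invented for illustration, not the study's data):

```python
def jaccard_similarity(a, b):
    """Jaccard similarity between two binary band profiles
    (1 = band present, 0 = absent): shared bands / bands in either."""
    shared = sum(1 for x, y in zip(a, b) if x == 1 and y == 1)
    union = sum(1 for x, y in zip(a, b) if x == 1 or y == 1)
    return shared / union if union else 1.0

# hypothetical profiles for three individuals across six RAPD bands
east_1 = [1, 1, 0, 1, 0, 1]
east_2 = [1, 1, 0, 1, 1, 1]
west_1 = [0, 1, 1, 0, 1, 0]

print(jaccard_similarity(east_1, east_2))  # 0.8
print(jaccard_similarity(east_1, west_1))  # ≈ 0.167
```

UPGMA then clusters individuals by averaging these pairwise similarities, which is what produces the dendrogram separating the eastern and western populations.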
Abstract:
Consumer-electronics systems are becoming increasingly complex as the number of integrated applications is growing. Some of these applications have real-time requirements, while other non-real-time applications only require good average performance. For cost-efficient design, contemporary platforms feature an increasing number of cores that share resources, such as memories and interconnects. However, resource sharing causes contention that must be resolved by a resource arbiter, such as Time-Division Multiplexing. A key challenge is to configure this arbiter to satisfy the bandwidth and latency requirements of the real-time applications, while maximizing the slack capacity to improve performance of their non-real-time counterparts. As this configuration problem is NP-hard, a sophisticated automated configuration method is required to avoid negatively impacting design time. The main contributions of this article are: 1) An optimal approach that takes an existing integer linear programming (ILP) model addressing the problem and wraps it in a branch-and-price framework to improve scalability. 2) A faster heuristic algorithm that typically provides near-optimal solutions. 3) An experimental evaluation that quantitatively compares the branch-and-price approach to the previously formulated ILP model and the proposed heuristic. 4) A case study of an HD video and graphics processing system that demonstrates the practical applicability of the approach.
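To make the configuration task concrete (this is a toy greedy allocator, not the paper's ILP or branch-and-price formulation), each real-time client could be given the minimum number of TDM slots covering its bandwidth share, with the remaining slots reported as slack for non-real-time applications; all names and numbers are hypothetical:

```python
import math

def allocate_tdm_slots(frame_size, requirements):
    """Greedy sketch: give each real-time client the smallest slot count
    covering its required fraction of total bandwidth; leftover slots
    become slack for the non-real-time applications."""
    allocation, used = {}, 0
    for client, share in sorted(requirements.items(), key=lambda kv: -kv[1]):
        slots = math.ceil(share * frame_size)
        if used + slots > frame_size:
            raise ValueError(f"frame of {frame_size} slots cannot fit {client}")
        allocation[client] = slots
        used += slots
    allocation["slack"] = frame_size - used
    return allocation

print(allocate_tdm_slots(10, {"video": 0.42, "audio": 0.10}))
# {'video': 5, 'audio': 1, 'slack': 4}
```

The hard part the paper addresses is absent here: the ceiling operation over-allocates, latency depends on how slots are spread across the frame, and maximizing slack under both constraints is what makes the real problem NP-hard.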
Abstract:
This project aims to optimize the routes of the after-sales service technicians of Schmitt+Sohn Elevadores, associated with the preventive maintenance of each item under contract with the company (elevators, escalators, etc.). To this end, the equipment in the portfolio must be distributed to one of the technicians responsible for its maintenance, over the working days of each month and the working hours of each day. Although a technician has 8 hours of work available per day, only 6 of them can be filled with preventive maintenance. The remaining 2 hours are reserved mainly for any corrective maintenance the technician is called out for. If the technician is not called to fix any fault, those hours can be used to get ahead on the next day's work, i.e. to visit some of the next day's preventive maintenance points in advance, or to catch up on delayed work. Note that, for each day, the technician's travel from any location to the first point of a route, or back from the last point of a route, is not counted. The work developed in this dissertation addresses the problem posed by Schmitt+Sohn Elevadores. For this purpose, a heuristic was developed to optimize the technicians' routes. It is based on the "nearest neighbour" concept, always seeking the point closest to the last point added to the route. Based on this methodology, on the processes for choosing the points that form clusters, and on the selection of the starting points of each daily route, the resulting optimization tool defines the daily routes so that the distance travelled by each technician in a month is as short as possible. The initially defined routes are modified whenever points of the same entrance are found to be visited on different days, which would force the technician to make two trips to the same location. Finally, the result is presented in a Word document to be used by the technician as a daily guide for visits to the equipment requiring periodic inspection. The routes obtained were compared with those in use by the company and proved to be of better quality, confirming the efficiency of the solution produced by the algorithm proposed in this work.
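The nearest-neighbour rule at the core of the heuristic can be sketched as follows (site names, coordinates, and the Euclidean metric are invented for illustration):

```python
def nearest_neighbour_route(start, points, dist):
    """Nearest-neighbour construction: repeatedly visit the unvisited
    point closest to the last point added to the route."""
    route, remaining = [start], set(points) - {start}
    while remaining:
        nxt = min(remaining, key=lambda p: dist(route[-1], p))
        route.append(nxt)
        remaining.remove(nxt)
    return route

# toy example on a plane (coordinates are illustrative, not real sites)
sites = {"A": (0, 0), "B": (1, 0), "C": (5, 0), "D": (1, 1)}
d = lambda p, q: ((sites[p][0] - sites[q][0]) ** 2
                  + (sites[p][1] - sites[q][1]) ** 2) ** 0.5
print(nearest_neighbour_route("A", sites, d))  # ['A', 'B', 'D', 'C']
```

The full tool layers clustering and the choice of daily starting points on top of this rule, which a plain nearest-neighbour pass does not capture.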
Abstract:
HHV-6 is the etiological agent of exanthem subitum, which is considered the sixth most frequent disease in infancy. In immunocompromised hosts, reactivation of latent HHV-6 infection may cause severe acute disease. We developed a SYBR Green real-time PCR for HHV-6 and compared the results with nested conventional PCR. A 214 bp PCR-derived fragment was cloned into the pGEM-T Easy vector (Promega). Subsequently, serial dilutions were made in a pool of negative leucocytes, from 10⁻⁶ ng/µL (equivalent to 2465.8 molecules/µL) to 10⁻⁹ ng/µL (equivalent to 2.46 molecules/µL). Dilutions of the plasmid were amplified by SYBR Green real-time PCR, using primers HHV3 (5' TTG TGC GGG TCC GTT CCC ATC ATA 3') and HHV4 (5' TCG GGA TAG AAA AAC CTA ATC CCT 3'), and by conventional nested PCR using primers HHV1 (outer): 5' CAA TGC TTT TCT AGC CGC CTC TTC 3'; HHV2 (outer): 5' ACA TCT ATA ATT TTA GAC GAT CCC 3'; HHV3 (inner); and HHV4 (inner). The detection threshold was determined by the plasmid serial dilutions. The threshold for SYBR Green real-time PCR was 24.6 molecules/µL and for the nested PCR it was 2.46 molecules/µL. We chose real-time PCR for diagnosing and quantifying HHV-6 DNA from samples using the new SYBR Green chemistry due to its sensitivity and lower risk of contamination.
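Converting a plasmid standard's concentration in ng/µL to molecules/µL uses the standard mass-to-copies formula; a sketch, where the 3000 bp plasmid size is a placeholder rather than the construct actually used in the study:

```python
AVOGADRO = 6.022e23
BP_WEIGHT = 650.0  # average g/mol per base pair of double-stranded DNA

def copies_per_microlitre(ng_per_ul, plasmid_bp):
    """copies/uL = mass per uL divided by the mass of one plasmid copy,
    where one copy weighs (length in bp) * 650 g/mol / Avogadro."""
    grams = ng_per_ul * 1e-9
    grams_per_copy = plasmid_bp * BP_WEIGHT / AVOGADRO
    return grams / grams_per_copy

print(copies_per_microlitre(1e-6, 3000))  # ≈ 308.8 copies/µL
```

Each further ten-fold dilution divides this figure by ten, which is how the serial dilutions map onto the molecules/µL values quoted above.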
Abstract:
In the traditional paradigm, large power plants supplied the reactive power required at the transmission level, while capacitors and transformer tap changers were used at the distribution level. However, in the near future it will be necessary to schedule both active and reactive power at the distribution level, due to the high number of resources connected there. This paper proposes a new multi-objective methodology for optimal resource scheduling that considers distributed generation, electric vehicles and capacitor banks for joint active and reactive power scheduling. The proposed methodology considers the minimization of the cost (economic perspective) of all distributed resources and the minimization of the voltage magnitude difference (technical perspective) in all buses. The Pareto front is determined and a fuzzy-based mechanism is applied to select the best compromise solution. The proposed methodology has been tested on the 33-bus distribution network. The case study shows the results of three different scenarios for the economic, technical, and multi-objective perspectives, and the results demonstrate the importance of incorporating reactive scheduling in the distribution network, using the multi-objective perspective to obtain the best compromise between the economic and technical objectives.
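A common fuzzy-based mechanism for picking the best compromise solution from a Pareto front (one plausible variant, not necessarily the exact one used in the paper) gives each solution a linear membership per minimized objective and selects the solution with the highest total membership; the (cost, voltage-deviation) points below are hypothetical:

```python
def best_compromise(front):
    """Fuzzy compromise sketch for minimization objectives: membership
    per objective is 1 at the front's best value and 0 at its worst;
    the solution with the highest summed membership wins."""
    n_obj = len(front[0])
    lo = [min(sol[j] for sol in front) for j in range(n_obj)]
    hi = [max(sol[j] for sol in front) for j in range(n_obj)]

    def membership(sol):
        return sum((hi[j] - sol[j]) / (hi[j] - lo[j]) if hi[j] > lo[j] else 1.0
                   for j in range(n_obj))

    return max(range(len(front)), key=lambda i: membership(front[i]))

# hypothetical (cost, voltage-deviation) points on a Pareto front
front = [(100.0, 0.09), (120.0, 0.05), (160.0, 0.02)]
print(best_compromise(front))  # 1: the middle solution balances both
```

The extremes score 1.0 (perfect on one objective, worst on the other), while the middle point scores about 1.24, so it is returned as the compromise.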
Abstract:
In this paper, we formulate the electricity retailers' short-term decision-making problem in a liberalized retail market as a multi-objective optimization model. Retailers with light physical assets, such as generation and storage units in the distribution network, are considered. Following advances in smart grid technologies, electricity retailers are becoming able to employ incentive-based demand response (DR) programs in addition to their physical assets to effectively manage the risks of market price and load variations. In this model, the DR scheduling is performed simultaneously with the dispatch of generation and storage units. The ultimate goal is to find the optimal values of the hourly financial incentives offered to the end-users. The proposed model considers the capacity obligations imposed on retailers by the grid operator. The profit-seeking retailer also has the objective of minimizing the peak demand to avoid high capacity charges in the form of grid tariffs or penalties. The non-dominated sorting genetic algorithm II (NSGA-II), a fast and elitist multi-objective evolutionary algorithm, is used to solve the multi-objective problem. A case study is solved to illustrate the efficient performance of the proposed methodology. Simulation results show the effectiveness of the model for designing the incentive-based DR programs and indicate the efficiency of NSGA-II in solving the retailers' multi-objective problem.
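The core of NSGA-II is sorting candidate solutions into successive non-dominated fronts. A naive sketch of that step for minimization objectives (NSGA-II itself uses a more efficient bookkeeping scheme, and the points here are invented):

```python
def dominates(a, b):
    """a dominates b if it is no worse in every (minimized) objective
    and strictly better in at least one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def non_dominated_sort(points):
    """Peel off successive fronts: front 1 is every point dominated by
    nobody, front 2 is what becomes non-dominated once front 1 is
    removed, and so on."""
    fronts, remaining = [], list(points)
    while remaining:
        front = [p for p in remaining
                 if not any(dominates(q, p) for q in remaining if q != p)]
        fronts.append(front)
        remaining = [p for p in remaining if p not in front]
    return fronts

print(non_dominated_sort([(1, 5), (2, 3), (4, 1), (3, 4), (5, 5)]))
# [[(1, 5), (2, 3), (4, 1)], [(3, 4)], [(5, 5)]]
```

NSGA-II breeds the next generation preferentially from the earlier fronts, using a crowding-distance measure to keep the surviving front well spread.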
Abstract:
According to the new KDIGO (Kidney Disease: Improving Global Outcomes) guidelines, the term renal osteodystrophy should be used exclusively in reference to the invasive diagnosis of bone abnormalities. Due to the low sensitivity and specificity of biochemical serum markers of bone remodelling, the performance of bone biopsies is strongly encouraged in dialysis patients and after kidney transplantation. Tartrate-resistant acid phosphatase (TRACP) is an iso-enzyme of the group of acid phosphatases that is highly expressed by activated osteoclasts and macrophages. In osteoclasts, TRACP is located in intracytoplasmic vesicles that transport the products of bone matrix degradation. Since it is present in activated osteoclasts, identification of this enzyme by histochemistry in undecalcified bone biopsies is an excellent method to quantify bone resorption. Because it is an enzymatic histochemical method for a thermolabile enzyme, the temperature at which it is performed is particularly relevant. This study aimed to determine the optimal temperature for identification of TRACP in activated osteoclasts in undecalcified bone biopsies embedded in methylmethacrylate. We selected 10 cases of undecalcified bone biopsies from hemodialysis patients with a diagnosis of secondary hyperparathyroidism. Sections of 5 μm were stained to identify TRACP at different incubation temperatures (37 ºC, 45 ºC, 60 ºC, 70 ºC and 80 ºC) for 30 minutes. Activated osteoclasts stained red, and trabecular (mineralized) bone was counterstained with toluidine blue. This approach also increased the visibility of the trabecular bone resorption areas (Howship lacunae). Unlike what is suggested in the literature and in several international protocols, we found that the best results were obtained at temperatures between 60 ºC and 70 ºC. For technical reasons and according to the results of the present study, we recommend that, for an incubation time of 30 minutes, the reaction be carried out at 60 ºC.
As active osteoclasts are usually scarce in a bone section, standardization of the histochemical method is of great relevance to optimize the identification of these cells and increase the accuracy of the histomorphometric results. Our results, by increasing osteoclast contrast, also support the use of semi-automatic histomorphometric measurements.
Abstract:
This paper addresses the challenging task of computing multiple roots of a system of nonlinear equations. A repulsion algorithm that invokes the Nelder-Mead (N-M) local search method and uses a penalty-type merit function based on the error function, known as 'erf', is presented. In the N-M algorithm context, different strategies are proposed to enhance the quality of the solutions and improve the overall efficiency. The main goal of this paper is to use a two-level factorial design of experiments to analyze the statistical significance of the observed differences in selected performance criteria produced when testing different strategies in the N-M based repulsion algorithm.
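In a two-level (2^k) factorial design, each factor's main effect is the mean response at its high level minus the mean at its low level. A minimal sketch with hypothetical response values (the performance criteria and effect sizes here are invented, not the paper's results):

```python
from itertools import product

def main_effects(factors, response):
    """2^k full factorial analysis sketch: `response` maps a tuple of
    +/-1 factor levels to a measured performance criterion; the main
    effect of factor j is mean(high runs) - mean(low runs)."""
    runs = list(product((-1, 1), repeat=factors))
    effects = []
    for j in range(factors):
        high = [response[r] for r in runs if r[j] == 1]
        low = [response[r] for r in runs if r[j] == -1]
        effects.append(sum(high) / len(high) - sum(low) / len(low))
    return effects

# two algorithmic strategies toggled on (+1) / off (-1), fabricated responses
response = {(-1, -1): 10.0, (-1, 1): 12.0, (1, -1): 20.0, (1, 1): 26.0}
print(main_effects(2, response))  # [12.0, 4.0]
```

Statistical significance is then judged by comparing each effect against an estimate of experimental error, e.g. from replicated runs.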
Abstract:
The Rural Postman Problem (RPP) is a particular Arc Routing Problem (ARP) which consists of determining a minimum-cost circuit on a graph so that a given subset of required edges is traversed. The RPP is an NP-hard problem with significant real-life applications. This paper introduces an original approach based on Memetic Algorithms - the MARP algorithm - to solve the RPP, and also deals with an interesting industrial application, which focuses on path optimization for component-cutting operations. Memetic Algorithms are a class of metaheuristics that may be seen as a population strategy involving cooperation and competition between population elements and integrating "social knowledge" through a local search procedure. The MARP algorithm is tested with different groups of instances and the results are compared with those gathered from other publications. MARP is also used in the context of various real-life applications.
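A bare-bones memetic loop (a generic sketch, not the MARP algorithm) combines evolutionary recombination and mutation with a local-search refinement of every individual; the toy problem, operators, and hill-climber below are all stand-ins:

```python
import random

def memetic_minimise(fitness, init, crossover, mutate, local_search,
                     pop_size=20, generations=50, seed=0):
    """Memetic skeleton: population-level cooperation/competition via
    recombination and selection, plus individual learning via local
    search applied to every new solution."""
    rng = random.Random(seed)
    population = [local_search(init(rng)) for _ in range(pop_size)]
    for _ in range(generations):
        offspring = []
        for _ in range(pop_size):
            a, b = rng.sample(population, 2)           # competition
            child = mutate(crossover(a, b, rng), rng)  # cooperation
            offspring.append(local_search(child))      # learning
        population = sorted(population + offspring, key=fitness)[:pop_size]
    return population[0]

def hill_climb(x):
    """Crude local search on the real line: step toward smaller |x|."""
    for step in (1.0, 0.1, 0.01):
        while abs(x - step) < abs(x):
            x -= step
        while abs(x + step) < abs(x):
            x += step
    return x

# toy usage: minimise |x| with averaging crossover and Gaussian mutation
best = memetic_minimise(abs, lambda rng: rng.uniform(-10, 10),
                        lambda a, b, rng: (a + b) / 2,
                        lambda x, rng: x + rng.gauss(0, 0.1),
                        hill_climb, pop_size=6, generations=4, seed=1)
print(best)  # close to 0
```

For the RPP, the individuals would instead encode orderings of the required edges, with crossover, mutation, and local search operating on those tours.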
Abstract:
Cloud data centers have been progressively adopted in different scenarios, as reflected in the execution of heterogeneous applications with diverse workloads and diverse quality of service (QoS) requirements. Virtual machine (VM) technology eases resource management in physical servers and helps cloud providers achieve goals such as optimization of energy consumption. However, the performance of an application running inside a VM is not guaranteed due to the interference among co-hosted workloads sharing the same physical resources. Moreover, the different types of co-hosted applications with diverse QoS requirements, as well as the dynamic behavior of the cloud, make efficient provisioning of resources even more difficult and a challenging problem in cloud data centers. In this paper, we address the problem of resource allocation within a data center that runs different types of application workloads, particularly CPU- and network-intensive applications. To address these challenges, we propose an interference- and power-aware management mechanism that combines a performance deviation estimator and a scheduling algorithm to guide the resource allocation in virtualized environments. We conduct simulations by injecting synthetic workloads whose characteristics follow the latest version of the Google Cloud tracelogs. The results indicate that our performance-enforcing strategy is able to fulfill contracted SLAs of real-world environments while reducing energy costs by as much as 21%.
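A toy version of interference- and power-aware placement (illustrating the idea only, not the proposed mechanism) avoids co-locating VMs that stress the same resource, while consolidating onto already-active hosts to save power; the host and VM structures are hypothetical:

```python
def place_vm(vm_type, hosts):
    """Place a VM of type 'cpu' or 'net' (hypothetical labels): prefer
    the active host with the fewest same-type VMs, and wake an idle
    host only when every active host would cause interference."""
    def interference(host):
        return sum(1 for t in host["vms"] if t == vm_type)

    active = [h for h in hosts if h["vms"]]
    candidates = active or hosts          # nothing active: open a host
    best = min(candidates, key=interference)
    if active and interference(best) > 0 and len(active) < len(hosts):
        best = next(h for h in hosts if not h["vms"])  # wake an idle host
    best["vms"].append(vm_type)
    return best["name"]

# hypothetical cluster: two identical idle servers
hosts = [{"name": "h1", "vms": []}, {"name": "h2", "vms": []}]
print(place_vm("net", hosts))  # h1
print(place_vm("net", hosts))  # h2: avoid stacking two network-heavy VMs
print(place_vm("cpu", hosts))  # h1: different resource type, consolidate
```

The paper's mechanism replaces the crude same-type count with a learned performance deviation estimator, which is what lets it trade interference against energy in a principled way.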
Abstract:
Previously we have presented a model for generating human-like arm and hand movements on a unimanual anthropomorphic robot involved in human-robot collaboration tasks. The present paper extends our model to address the generation of human-like bimanual movement sequences in scenarios cluttered with obstacles. Movement planning involves large-scale nonlinear constrained optimization problems, which are solved using the IPOPT solver. Simulation studies show that the model generates feasible and realistic hand trajectories for action sequences involving the two hands. The computational costs involved in the planning allow for real-time human-robot interaction. A qualitative analysis reveals that the movements of the robot exhibit basic characteristics of human movements.